Phi-1.5 implementation #14

@lurosenb

Phi-1.5 Summary

Phi-1.5 is a 1.3-billion-parameter Transformer that outperforms some larger models. It was trained on data sources similar to phi-1 (see https://arxiv.org/pdf/2306.11644.pdf) plus additional synthetic NLP texts, and was not fine-tuned with RL. It is open-source and well suited for QA, chat, and code prompts. Because it has no safety guardrails, it occasionally generates irrelevant text.

Tasks

  1. Add a well-supported example notebook showing how to use phi-1.5 via the Hugging Face implementation (https://huggingface.co/microsoft/phi-1_5); a minimal loading sketch is included below.
  2. Integrate phi-1.5 into the experimental pipeline for the larger LLM effort.
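
For task 1, a minimal sketch of what the notebook could start from, assuming the standard `transformers` `AutoTokenizer`/`AutoModelForCausalLM` API (older `transformers` releases may need `trust_remote_code=True` for this checkpoint); the prompt string is just an illustrative placeholder:

```python
# Minimal sketch: load and prompt microsoft/phi-1_5 via Hugging Face.
# Assumes transformers and torch are installed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/phi-1_5"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float32,  # consider torch.float16 on GPU to save memory
)

# Phi-1.5 responds well to QA-, chat-, and code-style prompts.
prompt = "Write a Python function that checks whether a number is prime."
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=128)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```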
