Model Tag:
dolphin-phi
The uncensored Dolphin model, developed by Eric Hartford, is a 2.7 billion parameter model based on Microsoft's Phi language model.
License
MICROSOFT RESEARCH LICENSE
Last Updated
2024-01-04
Dolphin Phi - The Uncensored Language Model
Dolphin Phi is an uncensored LLM, engineered by Eric Hartford and Cognitive Computations. This 2.7 billion parameter uncensored model builds upon the Phi language model developed by Microsoft Research.
Dolphin Phi 2.6 builds on the capabilities of the 2.7B Phi base model and is fine-tuned on datasets similar to those used for other Dolphin variants, such as Dolphin Mixtral. This design lets Dolphin Phi generate high-quality text while remaining uncensored, giving users greater flexibility in their applications.
Key Features of Dolphin Phi
- Model Size: Dolphin Phi comprises 2.7 billion parameters, striking a balance between performance and computational efficiency.
- Uncensored Outputs: This model is crafted to deliver responses without the restrictions often found in other language models, enabling it to cater to a broader range of inquiries.
- Shared Datasets: By training on datasets akin to those used for other Dolphin variants, Dolphin Phi draws on high-quality training data for accurate and relevant outputs.
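Since the model is published under the `dolphin-phi` tag, it can typically be run locally through Ollama. The sketch below is a minimal example, assuming an Ollama server is running on its default port (11434) with the model pulled (`ollama pull dolphin-phi`); it uses only Python's standard library to call Ollama's `/api/generate` endpoint.

```python
import json
import urllib.request

# Default Ollama endpoint (assumption: a local Ollama server is running).
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(prompt: str, model: str = "dolphin-phi") -> dict:
    """Build the JSON payload for Ollama's /api/generate endpoint."""
    # stream=False asks Ollama to return one complete JSON object
    # instead of a stream of partial responses.
    return {"model": model, "prompt": prompt, "stream": False}


def generate(prompt: str) -> str:
    """Send a prompt to the local dolphin-phi model and return its reply."""
    payload = json.dumps(build_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # Ollama returns the generated text in the "response" field.
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    print(generate("Summarize what a 2.7B-parameter model trades off versus larger models."))
```

The small 2.7B footprint means this call should complete quickly even on modest consumer hardware, which is the main practical appeal of Phi-sized models.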