
Microsoft has already started integrating AI into a variety of its services, including Bing, changing the way people search. The tech giant is not done yet: it has now introduced an all-new lightweight AI model, dubbed Phi-3-mini, for clients looking for cost-effective options.

As it bets its future on a technology expected to have a broad impact on the world and the way people work, the company has launched the first of its three small language models (SLMs), known as Phi-3-mini.

According to Sébastien Bubeck, Microsoft's vice president of GenAI research, "Phi-3 is not slightly cheaper, it's dramatically cheaper; we're talking about a 10x cost difference compared to the other models out there with similar capabilities."


According to the company, SLMs are designed to handle simpler tasks, which makes them easier for businesses with limited resources to adopt.

Furthermore, the company stated that Phi-3-mini will be made available right away on the machine learning model platform Hugging Face, in the AI model catalogue on Microsoft's cloud service platform Azure, and on Ollama, a framework for running models locally.
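For readers who want to try the model themselves, a minimal sketch of loading it from Hugging Face with the transformers library might look like the following. The model ID, prompt, and generation settings here are illustrative assumptions, not details taken from Microsoft's announcement.

```python
# Minimal sketch: loading a Phi-3-mini checkpoint from Hugging Face.
# The model ID and generation settings are assumptions for illustration,
# not details from the article.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3-mini-4k-instruct"  # assumed Hugging Face model ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

prompt = "Explain what a small language model is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The Ollama route mentioned above typically amounts to pulling and running the model from the command line, with a command of the form `ollama run phi3` (assuming that is the name under which the model is published there).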

The SLM has also been optimised for Nvidia's graphics processing units (GPUs) and will be accessible via Nvidia's software platform, Nvidia Inference Microservices (NIM).

Microsoft invested $1.5 billion in the UAE-based AI firm G42 last week. It has also previously partnered with the French startup Mistral AI to make its models accessible via the Azure cloud computing platform.


Meanwhile, Google is working on Gemini and may soon add a variety of features, including a floating window, Live Prompts for Gemini AI, and support for new file types. The company may also offer real-time, line-by-line responses to queries. However, these features have not been confirmed by Google and are not yet in testing, though reports suggest they may be added soon.