microsoft/phi-3.5-mini-instruct
PREVIEW
A lightweight multilingual LLM powering AI applications in latency-bound, memory/compute-constrained environments.
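As a minimal sketch of running the model in a memory/compute-constrained setting, the example below performs a local chat completion with the Hugging Face transformers library. The exact model identifier on the Hugging Face Hub, the dtype, and the generation settings are assumptions, not part of this listing; adjust them for your deployment target.

```python
# Minimal sketch, assuming the `transformers` and `torch` packages are installed
# and the model is available on the Hugging Face Hub under this identifier.
import torch
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="microsoft/Phi-3.5-mini-instruct",  # assumed Hub id; may differ from the catalog id
    torch_dtype=torch.bfloat16,               # lower-precision weights to reduce memory use
    device_map="auto",                        # place layers on GPU/CPU as available
)

# Chat-format input; recent transformers versions apply the model's chat template.
messages = [
    {"role": "user", "content": "Summarize why small LLMs suit edge deployments."},
]

output = generator(messages, max_new_tokens=128, do_sample=False)
print(output[0]["generated_text"][-1]["content"])
```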