Large language models, or LLMs, are behind the widespread adoption of generative AI services, which take a user's input and generate probable outputs based on deep learning models. Fine-tuning an existing model and using retrieval-augmented generation, or RAG, can yield highly context-aware results at a reduced resource cost.
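To illustrate the RAG idea mentioned above, here is a minimal, self-contained sketch: retrieve the document most relevant to a query, then prepend it to the prompt before generation. The document store, the toy bag-of-words scoring, and the prompt format are hypothetical placeholders for illustration only, not part of eIQ GenAI Flow's actual API.

```python
# Minimal RAG sketch (illustrative only): retrieve the most relevant document
# for a query, then prepend it as context to the prompt sent to an LLM.
from collections import Counter
import math

# Hypothetical local knowledge base; a real deployment would index product
# manuals, FAQs, or other domain documents.
DOCUMENTS = [
    "The i.MX 95 applications processor integrates an eIQ Neutron NPU.",
    "Retrieval-augmented generation grounds model answers in local documents.",
    "Fine-tuning adapts a pretrained model to a narrower task or domain.",
]

def embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding'; a real pipeline would use a neural encoder."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str) -> str:
    """Return the stored document most similar to the query."""
    q = embed(query)
    return max(DOCUMENTS, key=lambda d: cosine(q, embed(d)))

def build_prompt(query: str) -> str:
    """Augment the user query with retrieved context before generation."""
    context = retrieve(query)
    return f"Context: {context}\nQuestion: {query}\nAnswer:"

print(build_prompt("What NPU does the i.MX 95 include?"))
# The augmented prompt would then be passed to the on-device LLM for generation.
```

Because the model answers from retrieved context rather than from its weights alone, RAG can keep a smaller on-device model accurate and up to date without full retraining.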
NXP's eIQ GenAI Flow provides the tools to bring generative AI to the edge, leveraging the eIQ Neutron NPU in i.MX applications processors. eIQ GenAI Flow uses state-of-the-art open-source AI models optimized and maintained by NXP.
Ready to learn more? Visit nxp.com/LLM