{"id":4043,"date":"2024-03-30T06:13:08","date_gmt":"2024-03-30T06:13:08","guid":{"rendered":"https:\/\/neptunesolution.in\/blog\/?p=4043"},"modified":"2024-03-30T06:13:08","modified_gmt":"2024-03-30T06:13:08","slug":"mobile-application-development-service-3","status":"publish","type":"post","link":"https:\/\/neptunesolution.in\/blog\/2024\/03\/30\/mobile-application-development-service-3\/","title":{"rendered":"Mobile Application Development Service"},"content":{"rendered":"\n<p>Google has released an experimental API that allows large language models to run fully on-device across Android, iOS, and web platforms.<\/p>\n\n\n\n<p>Introduced March 7, the MediaPipe LLM Inference API was designed to streamline on-device LLM integration for web developers and supports web, Android, and iOS platforms. The API provides initial support for four LLMs:\u00a0Gemma,\u00a0Phi 2,\u00a0Falcon, and\u00a0Stable LM.<\/p>\n\n\n\n<p>Google warns that the API is experimental and still under active development, but says it gives researchers and developers the ability to prototype and test openly available models on-device. 
For Android, Google noted that production applications with LLMs can use the Gemini API, or Gemini Nano on-device through Android AICore, a system-level capability introduced in Android 14 that provides Gemini-powered solutions for high-end devices, including integrations with accelerators, safety filters, and LoRA adapters.<\/p>\n\n\n\n<p><strong>Email us:<\/strong>\u00a0<a href=\"mailto:contact@neptunesolution.in\">contact@neptunesolution.in<\/a><\/p>\n\n\n\n<p><strong>Call:<\/strong>&nbsp;0172-4102740, +91-9780373638, 7495055288&nbsp;for more details.<\/p>\n\n\n\n<p><strong>Visit us:<\/strong>&nbsp;<a href=\"http:\/\/www.neptunesolution.in\/\">www.neptunesolution.in<\/a><\/p>\n\n\n\n<p><strong>Office address:<\/strong>&nbsp;SCO 156-157, Second Floor, Sector 34-A, Near Verka Corporate Office, Chandigarh \u2013 160022<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Google has released an experimental API that allows large language models to run fully on-device across Android, iOS, and web platforms. Introduced March 7, the MediaPipe LLM Inference API was designed to streamline on-device LLM integration for web developers and supports web, Android, and iOS platforms. 
The API provides initial support for four LLMs:\u00a0Gemma,\u00a0Phi 2,\u00a0Falcon, [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":4040,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1],"tags":[],"class_list":["post-4043","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-top-it-company-in-chandigarh-mohali-and-panchkula"],"_links":{"self":[{"href":"https:\/\/neptunesolution.in\/blog\/wp-json\/wp\/v2\/posts\/4043","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/neptunesolution.in\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/neptunesolution.in\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/neptunesolution.in\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/neptunesolution.in\/blog\/wp-json\/wp\/v2\/comments?post=4043"}],"version-history":[{"count":1,"href":"https:\/\/neptunesolution.in\/blog\/wp-json\/wp\/v2\/posts\/4043\/revisions"}],"predecessor-version":[{"id":4044,"href":"https:\/\/neptunesolution.in\/blog\/wp-json\/wp\/v2\/posts\/4043\/revisions\/4044"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/neptunesolution.in\/blog\/wp-json\/wp\/v2\/media\/4040"}],"wp:attachment":[{"href":"https:\/\/neptunesolution.in\/blog\/wp-json\/wp\/v2\/media?parent=4043"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/neptunesolution.in\/blog\/wp-json\/wp\/v2\/categories?post=4043"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/neptunesolution.in\/blog\/wp-json\/wp\/v2\/tags?post=4043"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}