{"id":268629,"date":"2024-05-11T22:10:01","date_gmt":"2024-05-12T05:10:01","guid":{"rendered":"https:\/\/siliconeer.com\/current\/?p=268629"},"modified":"2024-05-11T22:10:36","modified_gmt":"2024-05-12T05:10:36","slug":"groq-to-provide-access-to-worlds-fastest-ai-inference-engine","status":"publish","type":"post","link":"https:\/\/siliconeer.com\/current\/groq-to-provide-access-to-worlds-fastest-ai-inference-engine\/","title":{"rendered":"Groq to Provide Access to World&#8217;s Fastest AI Inference Engine"},"content":{"rendered":"<p><u><a href=\"https:\/\/c212.net\/c\/link\/?t=0&amp;l=en&amp;o=4159072-1&amp;h=1902541722&amp;u=https%3A%2F%2Fgroq.com%2F&amp;a=Groq%C2%AE\" target=\"_blank\" rel=\"nofollow noopener\">Groq<\/a><\/u>, the leader in real-time AI inference, announced <u><a href=\"https:\/\/c212.net\/c\/link\/?t=0&amp;l=en&amp;o=4159072-1&amp;h=4207170177&amp;u=https%3A%2F%2Fsubmit-nairr.xras.org%2Fresources&amp;a=its+participation\" target=\"_blank\" rel=\"nofollow noopener\">its participation<\/a><\/u> in the National Artificial Intelligence Research Resource (NAIRR) Pilot, recently. The Pilot, a U.S. National Science Foundation-led program, marks the first step towards creating a shared national research infrastructure to connect U.S. researchers and educators to responsible and trustworthy AI research resources. In collaboration with 13 federal agencies and 25 private sector, nonprofit, and philanthropic organizations, Groq is powering the next phase of responsible AI research, discovery, and innovation by providing access to its LPU Inference Engine \u2013 the only solution delivering real-time AI inference today \u2013 via GroqCloud.<\/p>\n<p>Groq announces its participation in the National Artificial Intelligence Research Resource (NAIRR) Pilot.&#8221;Groq was founded, in part, to end the &#8216;haves and have-nots&#8217; in AI,&#8221; said Groq Public Sector President <span class=\"xn-person\">Aileen Black<\/span>. 
&#8220;Lack of access to necessary resources should never prevent a researcher from succeeding at the next Operation Warp Speed. It is an honor to provide the next generation of AI innovators with the real-time inference needed to run text-based applications and other AI workloads at scale.&#8221;<\/p>\n<p>With\u00a0GroqCloud, researchers can leverage leading open-source Large Language Models (LLMs) from providers such as Meta, Google, and Mistral, which have built models that lead industry benchmarks and human evaluations. The LPU Inference Engine makes it easy to conduct research, as well as to test and deploy new generative AI applications and other AI workloads, because it delivers 10x the speed while consuming just 1\/10th the energy of comparable GPU-based inference systems. Researchers can also access Groq technology via the <u><a href=\"https:\/\/c212.net\/c\/link\/?t=0&amp;l=en&amp;o=4159072-1&amp;h=3042569715&amp;u=https%3A%2F%2Fwww.anl.gov%2F&amp;a=ALCF+Argonne+Leadership+Compute+Facility\" target=\"_blank\" rel=\"nofollow noopener\">Argonne Leadership Computing Facility (ALCF)<\/a><\/u>, which includes a GroqRack compute cluster that provides an extensible accelerator network of 9 GroqNode servers with a rotational multi-node network topology.<\/p>\n<p>&#8220;Inference plays an increasingly pivotal role in the AI ecosystem of computing, data, software, and platforms that researchers require to advance innovation and scientific discovery in a responsible manner for the country. Groq&#8217;s contribution to the\u00a0NAIRR Pilot will enable researchers to access leading models, helping to realize their boldest research visions,&#8221; said <span class=\"xn-person\">Katie Antypas<\/span>, director of the Office of Advanced Cyberinfrastructure at the U.S. 
National Science Foundation.<\/p>\n<p>The\u00a0LPU Inference Engine is a new processing system developed to handle computationally intensive applications with sequential components, such as LLMs, audio processing, control systems, network observability, and more. While CPUs and GPUs excel at tasks such as data ingestion and model training, they struggle to execute at-scale inference for ultra-low-latency, real-time workloads: sub-optimal latency, throughput, and power consumption, along with their sequential processing limitations, adversely impact their effectiveness. Groq addressed these limitations when designing the LPU to ensure repeatable ultra-low latency without hindering performance.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Groq, the leader in real-time AI inference, recently announced its participation in the National Artificial Intelligence Research Resource (NAIRR) Pilot. The Pilot, a U.S. National Science Foundation-led program, marks the first step toward creating a shared national research infrastructure that connects U.S. researchers and educators to responsible and trustworthy AI research resources. 
In collaboration with&#8230;<\/p>\n<div class=\"read-more-link\"><a href=\"https:\/\/siliconeer.com\/current\/groq-to-provide-access-to-worlds-fastest-ai-inference-engine\/\">Read More<\/a><\/div>\n","protected":false},"author":1,"featured_media":268630,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[9],"tags":[3275,63565,63566],"class_list":["post-268629","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-business-and-tech","tag-ai","tag-groq","tag-nairr"],"acf":[],"_links":{"self":[{"href":"https:\/\/siliconeer.com\/current\/wp-json\/wp\/v2\/posts\/268629","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/siliconeer.com\/current\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/siliconeer.com\/current\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/siliconeer.com\/current\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/siliconeer.com\/current\/wp-json\/wp\/v2\/comments?post=268629"}],"version-history":[{"count":0,"href":"https:\/\/siliconeer.com\/current\/wp-json\/wp\/v2\/posts\/268629\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/siliconeer.com\/current\/wp-json\/wp\/v2\/media\/268630"}],"wp:attachment":[{"href":"https:\/\/siliconeer.com\/current\/wp-json\/wp\/v2\/media?parent=268629"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/siliconeer.com\/current\/wp-json\/wp\/v2\/categories?post=268629"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/siliconeer.com\/current\/wp-json\/wp\/v2\/tags?post=268629"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}