SubQuery Launches Decentralized AI Inference Hosting at Web3 Summit in Berlin

Singapore, Singapore, August 21st, 2024, Chainwire

At the Web3 Summit in Berlin today, SubQuery unveiled its newest innovation: decentralized AI inference hosting. In a live demonstration, SubQuery’s COO, James Bayly, showed how the latest Llama model runs across a fully decentralized network of Node Operators on SubQuery’s internal test network.

SubQuery’s vision is to empower developers to shape the future through decentralization. The company is at the forefront of a movement to build the next wave of Web3 applications for millions of users, with decentralization as the core principle.

The SubQuery Network is an advanced infrastructure layer that underpins this vision. It currently supports decentralized data indexers and RPCs, which are critical components for any developer building decentralized applications (dApps). SubQuery has proven itself as a credible alternative to centralized services, offering an open network where anyone can participate as a node operator or delegator.

The role of AI in transforming industries, including Web3, has become increasingly clear. SubQuery has been closely monitoring these developments and working behind the scenes to bring AI capabilities to its decentralized platform. “The Web3 Summit in Berlin, with its focus on decentralization, is the perfect stage for us to launch this new capability and demonstrate it live,” said James Bayly.

SubQuery is focused on AI inference, the process of using pre-trained models to make predictions on new data, rather than on model training. “While there are commercial services that offer inference hosting for custom models, few exist within the Web3 space,” James explained. “Our decentralized network is ideally suited for reliable, long-term AI model hosting.”
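For readers unfamiliar with the distinction, the short sketch below shows what inference looks like in practice: a pre-trained model is loaded and queried with new input, and no training takes place. It assumes the open-source Hugging Face transformers library and a small publicly available model (distilgpt2); it is purely illustrative and does not represent SubQuery’s API or hosting setup.

    # Inference, not training: load a pre-trained model and query it with new data.
    from transformers import pipeline

    # The model weights arrive ready-made; nothing is trained in this script.
    generator = pipeline("text-generation", model="distilgpt2")

    # Inference step: the frozen model produces a prediction for unseen input.
    result = generator("Decentralized networks are", max_new_tokens=20)
    print(result[0]["generated_text"])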

Currently, the market for AI inference is dominated by large centralized cloud providers who charge high fees and often use user data to improve their proprietary models. “Providers like OpenAI and Google Cloud AI are not only expensive but also leverage your data to enhance their closed-source offerings,” James noted. SubQuery is committed to providing an affordable, open-source alternative for hosting production AI models. “Our goal is to make it possible for users to deploy a production-ready LLM model through our network in just 10…
