Sudipta Deb

Founder of Technical Potpourri, Co-Founder of Shrey Tech, Enterprise Cloud Architect

Last week in Las Vegas, more than 30,000 people and 2,500 partners attended Google Cloud Next 2024. The event made the company's redoubled commitment to generative AI, and AI in general, unmistakable.

The key points from the company’s premier cloud event are outlined here.

  • Google Cloud Next 2024 concluded on Thursday.
  • With 2,500 partners and over 30,000 guests, attendance at Google Cloud Next 2024 rose 100%.
  • Google reaffirmed its commitment to generative AI and AI in general with Google Cloud Next 2024.

This year, the annual cloud conference Google Cloud Next was held in Las Vegas instead of its customary venue in San Francisco. Beyond the new location, Google Cloud Next 2024 showcased the company's increased commitment to generative AI and artificial intelligence (AI) in general.

“One of the reasons Cloud is showing so much progress is our deep investments in AI. We’ve known for a while that AI will be the next technology to transform companies. Our investments in AI infrastructure and models help put us at the forefront of the AI platform shift. And we’re proud that today more than 60% of funded generative AI startups, and nearly 90% of gen AI unicorns are Google Cloud customers,” noted Sundar Pichai, CEO of Google.

Ram Boreda, VP of Products at Egnyte, attended Google Cloud Next 2024 in Las Vegas and attests to Google Cloud’s success with startups. “The startup ecosystem leveraging GCP has grown tremendously,” he told Spiceworks News & Insights. “The demo grounds had more vendors and serious products. This shows that GCP has established itself as a leader in the new wave with AI.”

“Every product in Google’s catalog had an AI uplift,” Boreda noted. “Announcements were made for every product portfolio – modern Infra Cloud, Developer Cloud, Data Cloud, Security Cloud, Collaboration Cloud.”

AI Infrastructure

Axion central processing units, Google's answer to Arm-based server chips such as AWS's Graviton, Alibaba's Yitian 710, and Microsoft's Azure Cobalt 100 (alongside its Maia 100 AI accelerator), are now part of Google's infrastructure optimization efforts for AI.

While we know that Axion is built on Arm's Neoverse V2 cores, benchmarking data, architectural specifics, and other technical documentation for Google's proprietary Arm-based cloud silicon remain unavailable.

Google claims that Axion processors, which are intended for general-purpose computing, deliver up to 50% better performance and up to 60% better energy efficiency than comparable current-generation x86-based instances. Additionally, the company claimed that Axion outperforms the fastest general-purpose Arm-based instances offered by Microsoft and AWS by 30%.

Boreda stated, “Google is catching up to AWS and Microsoft, who already have chips built on the Arm architecture. All of this contributes to a downward trend in costs.”

Google also made generally available its most powerful tensor processing unit, the TPU v5p, which can scale to tens of thousands of chips. A single v5p pod contains 8,960 chips, more than twice as many as a TPU v4 pod. In addition, the TPU v5p offers four times greater scalability, three times more high-bandwidth memory, and twice the floating-point operations per second.
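The pod-size comparison above can be sanity-checked with quick arithmetic. Note one assumption not stated in this article: the 4,096-chip figure for a TPU v4 pod, which is the number Google published for that generation.

```python
# Back-of-the-envelope check of the "more than twice as many chips" claim.
# Assumption (not from this article): a TPU v4 pod has 4,096 chips.
V5P_POD_CHIPS = 8_960
V4_POD_CHIPS = 4_096  # published v4 pod size (assumption)

ratio = V5P_POD_CHIPS / V4_POD_CHIPS
print(f"A v5p pod has {ratio:.2f}x the chips of a v4 pod")  # ≈ 2.19x
```

So "more than twice" holds: the v5p pod is roughly 2.2x the v4 pod by chip count.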

“AI Infrastructure gets a big boost,” Boreda stated, citing Google TPUs, NVIDIA H100 GPUs, NVIDIA GB200 support coming in 2025, AI-optimized storage (Hyperdisk ML), dynamic workload management (Dynamic Workload Scheduler), and the Google Axion Arm processor for the data center, among other things.

Vertex AI Updates

First up, customers can create and deploy AI agents with Vertex AI Agent Builder, a no-code service. It integrates Google Search, developer tools, and several large language models.

Google has reduced hallucinations and grounded results (i.e., tied outputs to reputable sources) by implementing vector search and retrieval-augmented generation (RAG). Vertex AI Agent Builder is powered by Gemini's large language models. Google demonstrated a number of use cases, such as contract management and retail.
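The grounding idea behind RAG can be sketched in a few lines. This is purely illustrative and not Vertex AI's implementation: the corpus, the word-count "embeddings," and the cosine retrieval are toy stand-ins for the learned vector embeddings and managed vector search a real system would use.

```python
import math
from collections import Counter

# Hypothetical documents with source labels (illustrative only).
CORPUS = {
    "doc-1": "The contract renewal deadline is March 31.",
    "doc-2": "Retail returns are accepted within 30 days of purchase.",
}

def embed(text: str) -> Counter:
    """Toy 'embedding': lowercase bag-of-words counts."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str) -> tuple[str, str]:
    """Return the (source_id, passage) most similar to the query."""
    q = embed(query)
    return max(CORPUS.items(), key=lambda kv: cosine(q, embed(kv[1])))

def grounded_answer(query: str) -> str:
    # A grounded response quotes the retrieved passage and cites its
    # source, rather than answering from the model's memory alone.
    source, passage = retrieve(query)
    return f"{passage} [source: {source}]"

print(grounded_answer("When is the contract renewal deadline?"))
# → The contract renewal deadline is March 31. [source: doc-1]
```

The key point is that the answer carries a citation back to retrieved source material, which is what lets a system like Agent Builder both reduce hallucinations and show where a result came from.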

More AI models are also available in Vertex AI today, including Anthropic's new Claude 3 family, Gemini 1.5 Pro, CodeGemma (Google's code-assistance model), Imagen 2 (text-to-image), and others.

Notably, Gemini 1.5 Pro is now in public preview, with a multimodal context window of up to about one million tokens.

Workspace AI Updates

More AI features are being added to Google Workspace, the productivity suite that billions of people use daily. Among them:

  • Gmail voice suggestions for writing emails and improving drafts
  • Spreadsheet templates and change alerts in Sheets
  • Improved document organization features in Docs

More significantly, Google Vids, a new AI-powered video creation service, is coming to Workspace. Vids is designed to work alongside Docs and Sheets for a collaborative experience aimed at organizational objectives such as producing explainer videos, product decks, and more. It is currently available for limited testing.

Google plans to counter Microsoft’s Copilot initiative by integrating Gemini into all of its app stack products. 

Hyperdisk Storage Pools

Google's Hyperdisk Storage Pools, a block storage service designed for cost and compute efficiency, are among the less publicized improvements. They let developers pool storage across various workloads, including AI inferencing.

According to Google, “Hyperdisk ML offers cost efficiency through read-only, multi-attach, and thin provisioning and accelerates model load times up to 12X compared to common alternatives. It offers up to 1.2 TiB/s of aggregate throughput per volume and allows up to 2,500 instances to access the same volume—over 100X greater performance than Microsoft Azure Ultra SSD and Amazon EBS io2 BlockExpress.”
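Quick arithmetic on the quoted figures shows what the aggregate numbers imply per instance. The even split is an assumption for illustration; real workloads are rarely uniform.

```python
# Per-instance share of the quoted aggregate read throughput,
# assuming an even split across the maximum instance count.
AGGREGATE_TIB_PER_S = 1.2   # quoted aggregate throughput per volume
MAX_INSTANCES = 2_500       # quoted maximum attached instances

gib_per_s_per_instance = AGGREGATE_TIB_PER_S * 1024 / MAX_INSTANCES
print(f"{gib_per_s_per_instance:.3f} GiB/s per instance")  # ≈ 0.492 GiB/s
```

In other words, even fully subscribed, each of the 2,500 readers could still see roughly half a GiB/s, which is the kind of headroom that matters when thousands of inference servers load the same model weights.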

Hyperdisk Storage Pools let users make efficient use of their resources while lowering their overall cost of ownership and management.

Conclusion

As we bid farewell to Google Cloud Next 2024, a sense of awe and inspiration lingers in the air. The last day of this groundbreaking event has come to a close, leaving us with a treasure trove of new announcements and insights to digest and explore.

From the unveiling of cutting-edge technologies to the deep dives into the future of cloud computing, each day of Google Cloud Next has been a testament to the boundless innovation and ingenuity that define the tech industry. As we reflect on the wealth of knowledge gained and the connections made, it’s clear that the impact of this event will reverberate far beyond the walls of the Mandalay Bay Convention Center.

But our journey doesn’t end here. As we move forward, armed with the latest tools and insights from Google Cloud Next, we embark on a new chapter of discovery and growth. The innovations unveiled during this event will continue to shape the way we work, collaborate, and innovate in the years to come.

So, as we say goodbye to Google Cloud Next 2024, let us carry forward the spirit of curiosity, creativity, and collaboration that has defined these unforgettable days. And remember, the future of technology is bright, and with each new discovery, we inch closer to realizing its full potential.

Until we meet again, may your dreams be bold, your aspirations lofty, and your journey through the world of technology filled with endless possibilities. Farewell, Google Cloud Next 2024 – until next time!

Disclaimer

This article is not endorsed by Salesforce, Google, or any other company in any way. I shared my knowledge on this topic in this blog post. Please always refer to Official Documentation for the latest information.
