Technology News

Mukesh Ambani joins hands with Google and Meta to build India’s AI backbone: what he revealed at Reliance AGM

Mukesh Ambani, India’s richest man, unveils Reliance Intelligence, a new subsidiary partnering with Google Cloud and Meta to power India’s AI revolution.

Mukesh Ambani unveils Reliance Intelligence with Google and Meta to power India’s AI future.

India is stepping boldly into the artificial intelligence race, and the push is being led by none other than Mukesh Ambani. At the 48th annual general meeting of Reliance Industries, Ambani announced the launch of Reliance Intelligence, a new subsidiary designed to become the country’s AI backbone — powered by strategic partnerships with Google and Meta.


Ambani told shareholders that Reliance Intelligence would become a hub for “world-class researchers, engineers, designers, and product builders,” combining research speed with engineering scale to deliver AI-driven solutions for India and beyond.

Building India’s AI cloud

The first step in this vision involves a deep collaboration with Google Cloud. Reliance is building a dedicated AI cloud region starting with a large data center in Jamnagar, Gujarat. This network will integrate Jio’s telecom infrastructure with Reliance’s own energy assets to deliver AI services at scale to developers, enterprises, and government institutions.

Sundar Pichai, CEO of Google, reinforced the importance of the tie-up in a video address during the AGM. “As Reliance’s largest public cloud partner, Google Cloud is not only powering mission-critical workloads, but we are also innovating with you on advanced AI initiatives. This is only the beginning,” he said.

A $100 million partnership with Meta

Reliance also confirmed a joint venture with Meta, with a combined investment of ₹8.55 billion (around $100 million). Reliance will hold a 70% stake, with Meta retaining the remaining 30%.

The partnership brings Meta’s Llama-based enterprise AI models to Indian businesses, offering platform-as-a-service tools for sectors like sales, finance, customer service, and marketing. Pre-configured generative AI solutions will also be made available, tailored for enterprises looking to quickly adopt AI.

Mark Zuckerberg, CEO of Meta, said in a statement: “Through this joint venture, we’re putting Meta’s Llama models into real-world use.”

The deal, pending regulatory approvals, is expected to close in the fourth quarter of 2025.

Expanding beyond India

Ambani hinted at even bigger plans. Reliance Jio, the telecom and digital subsidiary, will enter international markets, with an initial public offering (IPO) expected in the first half of 2026.

Reliance is also reportedly in talks with OpenAI to collaborate on consumer AI services in India. According to sources, details may be revealed during Sam Altman’s upcoming visit to New Delhi next month.

This comes as Reliance strengthens existing partnerships — from Microsoft for Azure Cloud to consumer-facing products like JioAICloud, which already serves 40 million users with storage and AI content tools.

Jio’s AI vision for consumers

At the AGM, Ambani showcased how Reliance is bringing AI directly to consumers. This included:

  • JioFrames, AI-powered smart glasses positioned against Ray-Ban Meta glasses and Snap Spectacles.
  • JioHotstar, which has surpassed 600 million users since its February relaunch, now featuring an AI voice assistant named “Riya.”
  • New AI translation tools capable of lip-synced video dubbing across Indian languages.

Meanwhile, rival Bharti Airtel has already partnered with AI search startup Perplexity, giving 360 million Airtel subscribers free access to Perplexity Pro. The battle for AI supremacy in India is now a two-horse race — Ambani’s Reliance versus Airtel’s emerging alliances.

A national ambition

By uniting with Google and Meta, Reliance is positioning itself not only as India’s telecom leader but also as a global AI powerhouse. Ambani’s play is clear: build infrastructure, attract talent, and turn India into an AI superpower rivaling the U.S. and China.

Whether Reliance Intelligence achieves its goal will depend on execution and regulatory clarity, but one thing is certain — Ambani has fired the opening shot in India’s AI revolution.



Technology News

Inside the Vision of the Man Who Trusts Dogs to Tell Stories on the Big Screen

From AI labs to film sets, BARK innovation chief Mikkel Holm has a radical idea — what if dogs weren’t just stars, but storytellers?


In an era where artificial intelligence is already writing scripts, composing music, and generating entire films, one creative mind is asking a question that feels equal parts absurd and oddly profound: Why shouldn’t dogs be directors?

That mind belongs to Mikkel Holm, the Chief AI & Innovation Officer at BARK, the pet brand best known for turning dog culture into a billion-dollar business. Holm isn’t pitching a gimmick. He’s questioning how creativity itself is defined — and who gets to own it.

From Fetch to Final Cut

Holm’s thinking sits at the crossroads of AI, storytelling, and animal behavior. With generative tools becoming more intuitive, he believes creativity no longer needs to start with a human idea. A dog’s reactions — what excites them, what scares them, what keeps their attention — could become the raw data that shapes narratives.

“Dogs already tell us what they like,” Holm has suggested in industry conversations. “We just haven’t been listening in a cinematic way.”


Using sensors, computer vision, and behavioral AI models, a dog’s gaze, movement, or excitement could guide editing decisions, pacing, or even story arcs. The result wouldn’t be about dogs — it would be cinema filtered through a non-human perspective.
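Holm’s concept remains speculative, and BARK has described no actual data model. But the core loop — rank footage by measured canine engagement and let that ordering drive the edit — can be sketched in a few lines. Every field name, weight, and number below is invented purely for illustration:

```python
# Hypothetical sketch: choose which takes survive the edit based on
# measured canine engagement. All field names and numbers are invented
# for illustration; BARK has not published a real data model.
from dataclasses import dataclass

@dataclass
class Take:
    name: str
    gaze_seconds: float   # how long the dog watched the screen
    tail_wags: int        # excitement proxy from a hypothetical wearable sensor

def engagement(take: Take) -> float:
    """Combine attention and excitement into one score (weights are arbitrary)."""
    return take.gaze_seconds + 0.5 * take.tail_wags

takes = [
    Take("squirrel_chase", gaze_seconds=12.0, tail_wags=30),
    Take("slow_landscape", gaze_seconds=3.5, tail_wags=2),
    Take("ball_bounce", gaze_seconds=9.0, tail_wags=18),
]

# The "edit": order takes by what the dog responded to most strongly
final_cut = sorted(takes, key=engagement, reverse=True)
print([t.name for t in final_cut])  # → ['squirrel_chase', 'ball_bounce', 'slow_landscape']
```

A real pipeline would replace the toy score with computer-vision gaze tracking and behavioral models, but the principle — the dog’s reactions as the ranking signal — is the same.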

The Birth of the First Park Chan-Woof?

Holm jokingly refers to the possibility of minting the next Park Chan-wook — except this auteur would wag instead of walk the red carpet. The joke lands because it highlights something serious: great directors don’t just tell stories, they feel them. And dogs, arguably, are pure instinct.

Unlike human creators shaped by trends, algorithms, or box-office anxiety, dogs respond honestly. They don’t care about three-act structures or Rotten Tomatoes scores. They react in real time — and Holm believes that authenticity is something modern storytelling desperately needs.

Why BARK Is the Perfect Place for This Idea

At BARK, data about canine behavior isn’t abstract. It’s central to the business. Millions of interactions — toys chewed, treats rejected, boxes loved — already inform product design. Translating that behavioral intelligence into creative output feels like a natural extension.

Holm’s role isn’t about replacing human creators. Instead, it’s about collaboration — humans setting the framework, AI translating signals, and dogs influencing the final creative choices in ways we’ve never seen before.

Is This Art or Absurdity?

Skeptics, of course, will laugh. Dogs as directors sounds like a headline built for clicks. But then again, so did AI-written novels, virtual influencers, and fully synthetic pop stars — until they weren’t jokes anymore.

Holm’s idea taps into a deeper cultural shift: creativity is no longer exclusively human. As tools evolve, authorship becomes shared — between humans, machines, and perhaps, one day, animals.

And if the result is strange, emotional, or unexpectedly beautiful? That might be the point.

A Future Where Creativity Isn’t Just Human

Cinema has always evolved with technology — from silent films to sound, black-and-white to color, analog to digital. Holm’s vision suggests the next leap might not be technical, but philosophical.

What happens when we stop asking who is allowed to create?

If the first dog-directed short film ever premieres at a festival someday, don’t be surprised if it doesn’t explain itself. Dogs, after all, have never felt the need to justify their instincts. Maybe storytellers shouldn’t either.



Technology News

Why Is a Giant Emoji Staring at F1 Cars in Las Vegas? Sphere’s Orbi Sparks Wild Curiosity…

The massive yellow face watching the Las Vegas Grand Prix isn’t just a gimmick — Sphere’s Orbi is tracking real Formula 1 cars live using GPS technology.

Orbi at the Sphere follows real Formula One cars during the Las Vegas Grand Prix using live GPS data.

If you thought the dazzling lights of Las Vegas were already impossible to ignore, this weekend’s Las Vegas Grand Prix took spectacle to a whole new level. Standing tall over the Strip, the world-famous Sphere — a global social media magnet since its opening — decided to become an active Formula 1 spectator. And the star of the show was not a driver, but a giant yellow emoji named Orbi.

During Saturday night’s high-octane race, F1 fans noticed something unusual: Orbi’s enormous animated eyes kept following race cars — turning left, right, up, down — almost like a living spectator. Many assumed it was just clever animation. But the truth is far more surprising.

Orbi is actually tracking the real F1 cars in real time.

According to Sphere executives, Orbi’s movements are powered by a technological model built specifically for the race. The system receives continuous GPS data directly from Formula One race headquarters in Biggin Hill, near London — the same hub used for global race monitoring and broadcast feeds.
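Sphere hasn’t published implementation details, but the geometric core of “eyes that follow a car” is a bearing calculation from the venue to each incoming GPS fix. Everything here is an illustrative assumption — the coordinates are approximate and the function name is invented:

```python
import math

# Approximate location of the Sphere in Las Vegas (illustrative, not official)
SPHERE_LAT, SPHERE_LON = 36.1207, -115.1678

def bearing_to_car(car_lat: float, car_lon: float) -> float:
    """Compass bearing (degrees, 0 = north) from the Sphere to a car's GPS fix.

    A production system would also smooth successive fixes and map the
    bearing into screen-space eye coordinates; this shows only the
    core geometry of 'eyes follow the car'.
    """
    lat1, lat2 = math.radians(SPHERE_LAT), math.radians(car_lat)
    dlon = math.radians(car_lon - SPHERE_LON)
    x = math.sin(dlon) * math.cos(lat2)
    y = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(dlon)
    return (math.degrees(math.atan2(x, y)) + 360) % 360

# A fix slightly east of the Sphere yields a bearing of roughly 90 degrees
print(round(bearing_to_car(36.1207, -115.1600)))  # → 90
```

Feed that bearing a stream of per-car GPS updates from the broadcast data feed and the animation layer only has to point the pupils at the resulting angle.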

“We wanted Orbi to be an F1 fan with everyone else,” one Sphere spokesperson said. “His eyes aren’t just looking around — they’re synced to the position of any car on the track. He’s literally watching the race with us.”

ALSO READ : Harvard Opens New Probe Into Jeffrey Epstein Ties as Larry Summers Steps Back From Public Roles

The result? A landmark collision between sports, tech, engineering, and digital art.

A 366-Foot-Tall Emoji Becomes the World’s Newest F1 Fan

Sphere has already earned international attention for its jaw-dropping LED surface — a screen so large it can be seen from planes, freeways, and hotels across the Strip. But using Orbi to visually follow speeding race cars added something deeper: personality.

At times, Orbi appeared stressed during overtakes, shocked during near collisions, and thrilled during lead changes. Fans watching from grandstands and balconies couldn’t help but laugh — or film — the giant emoji reacting like a true racing enthusiast.

And no, Orbi doesn’t pick sides.

According to the company, every driver — whether Max Verstappen, Lewis Hamilton, Charles Leclerc or a rookie — holds the same spot in Orbi’s yellow heart.

Why Did Sphere Do This?

The answer lies in branding — but also emotion.

The venue is owned by Sphere Entertainment Co., a company built on immersive audience experiences. The Las Vegas Grand Prix presented a once-in-a-generation opportunity: a global event attended by over 300,000 people, with millions more watching worldwide.

But instead of just advertising, the Sphere team wanted connection.

“Everything we do at Sphere is meant to move people emotionally,” the spokesperson said. “People love Orbi. He’s not just watching — he’s participating.”

For a city fueled by entertainment, risk and reinvention, Orbi became the digital mascot fans never knew they needed.

The Internet Reacted — Predictably

Within minutes, TikTok and X were filled with videos captioned:

  • “Why is the emoji stalking Lewis Hamilton?”
  • “Orbi has better race awareness than Ferrari’s strategy team”
  • “I didn’t expect to feel emotionally supported by a building”

Even racing analysts couldn’t resist commenting on it during broadcast segments.

And just like that — Orbi became the most photographed spectator at the Grand Prix.

What This Means for the Future of Live Events

Tech experts say this moment could influence how major sporting events engage fans. Instead of static billboards or predictable LED ads, imagine:

  • Stadiums reacting to goals in real time
  • Concert venues syncing visuals to audience heart rates
  • Cities turning into live data canvases

Sphere didn’t just display content — it participated in the event.

And honestly? It worked.

Las Vegas is already known for larger-than-life entertainment, but a 162,000-square-foot emoji tracking F1 cars may be the most Vegas thing ever.
