From Code to Cloud: AI, Microservices & DevOps
Podcast – Ep 6 | November 9, 2025 | Duration: 30 minutes 34 seconds
Podcast Summary
In this episode of Tech Is Our Passion, Neha sits down with Bhavesh Chandra, a seasoned technology leader with over a decade of experience, to explore the shift from traditional code to intelligent, AI-powered cloud systems.
Bhavesh walks through his journey from application developer to architect of scalable microservices and AI-driven architectures. A turning point came in 2018, when he saw AI’s real-world impact in healthcare, particularly in projects for the early diagnosis of cataracts and myopia, where AI models achieved nearly 89% accuracy.
At the core of this episode is the synergy between microservices and AI. Bhavesh explains how microservices act as the critical bridge between databases, APIs, and AI models, enabling real-time, reliable decision-making. He shares practical examples from agentic commerce platforms, healthcare diagnostics, and AI-powered trading systems.
The conversation also tackles one of the biggest challenges in production AI: AI hallucinations. Bhavesh breaks down how hallucinations often stem from outdated or incomplete training data and why the solution lies in continuous data ingestion through microservices, combined with strong security, governance, and compliance.
On cloud strategy, Bhavesh advises choosing between AWS, GCP, and Azure based on the problem you’re solving, total cost of ownership, and your team’s existing skills, not hype or trends.
Closing on careers in the age of AI, Bhavesh answers the critical question: “Will AI take my job?” His take: AI won’t take your job, but someone using AI will have an edge. The real opportunity lies in the convergence of AI, microservices, and cloud: not to replace teams, but to build smarter systems that amplify human capability.
Podcast Transcript
From Code to Cloud – Bhavesh’s Journey
Neha:
Hi, Bhavesh. Thank you so much for joining us today. You’ve been in the tech industry for over a decade now, right? Let’s dive into your journey from code to cloud. Can you share what your early days in application development looked like and how that evolved into working with big data and now AI?
Bhavesh Chandra:
Hello, Neha, thanks for having me. Well, it all started back in 2011 before I even officially began working. I was involved with an NGO in Bangalore, which I co-founded with a few like-minded people. We were tasked with building a website, and that was my first go-to-market product. After completing my engineering, I joined Vuma Technologies, a company that worked with Indian Railways, which gave me hands-on experience with hardware and building monitoring services across regions like NCR, Telangana, Andhra Pradesh, Maharashtra, and Karnataka.
Neha:
Got it!
Bhavesh Chandra:
Yes. After that, I joined Infosys as an educator, where I spent about three years working with Angular and React. Back in 2012, we were still using JSP and servlets for front-end development. It was a time when IoT was taking off, and I connected with some like-minded individuals to explore healthcare applications. We moved into cloud services and big data technologies, which marked a major turning point in my career.
Neha:
That’s fascinating! From working with hardware to application development, cloud, and now AI. So, can you tell us what your “aha moment” was, when you realized AI wasn’t just a buzzword but something that would shape how you build applications?
The Aha Moment: Realizing AI’s Potential
Bhavesh Chandra:
Sure, that moment came in 2018 when I was involved in healthcare projects. We had IoT devices like glucometers, which allowed users to monitor their sugar levels on their phones. But the real game-changer came when my CEO visited the Bay Area and met some Israeli startups focused on AI-driven early diagnosis for cataracts and myopia. They were using AI to analyze eye pupil images to predict the likelihood of these conditions. We saw their potential, and we quickly built a pilot application and ran it with a large user base in the US. That’s when I realized the true power of AI—predicting the future.
Neha:
That’s a remarkable project! Early diagnosis of cataracts or myopia could help millions. Can you share if these AI solutions are available in India?
Bhavesh Chandra:
Yes, they are. AI is increasingly being used in healthcare, not just in India, but globally. While it’s not 100% accurate yet, AI systems today are achieving up to 89% accuracy in diagnostics, which is a huge improvement.
Neha:
89% is impressive! Early intervention can make a big difference, so this is definitely a step forward. Now, moving on to microservices. You’ve worked on microservices, which are crucial for AI-driven systems, right? Can you explain why microservices are a game-changer for AI-heavy applications?
Microservices: The Backbone of AI-Driven Systems
Bhavesh Chandra:
Let’s take a simple example. Imagine using AI tools like ChatGPT for planning a holiday. You might ask it for an itinerary, and it gives you a plan. However, if you go back after 10 days, it won’t remember your previous data. This is because we’re not working within an ecosystem.
Now, think of services like Netflix, which already have a user base and a database. By integrating AI, we can leverage their data to create predictive models. Microservices act as the bridge between databases and AI models. So when AI generates insights or predictions, those outputs are stored back in the database, and the data continuously feeds into the AI model for improved accuracy over time.
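To make that bridge concrete, here is a minimal sketch of the pattern Bhavesh describes: a microservice pulls a user's history from a database, calls a model, and writes the prediction back so future training runs can use it. The table names and the recommendation stub are illustrative only, not from the episode.

```python
# Minimal sketch of a "microservice as bridge" between a database and an AI model.
# The schema and the recommend() stub are illustrative, not from the episode.
import sqlite3
from datetime import datetime, timezone

db = sqlite3.connect("users.db")
db.execute("CREATE TABLE IF NOT EXISTS watch_history (user_id TEXT, title TEXT)")
db.execute("CREATE TABLE IF NOT EXISTS predictions (user_id TEXT, suggestion TEXT, created_at TEXT)")

def recommend(history: list[str]) -> str:
    """Stand-in for a real model call (a hosted LLM or a trained recommender)."""
    return f"Because you watched {history[-1]}, try something similar" if history else "Try a popular title"

def handle_recommendation_request(user_id: str) -> str:
    # 1. The microservice pulls the user's context from the database.
    rows = db.execute("SELECT title FROM watch_history WHERE user_id = ?", (user_id,)).fetchall()
    history = [title for (title,) in rows]

    # 2. It calls the AI model with that context.
    suggestion = recommend(history)

    # 3. It stores the model output back, so the next training cycle can learn from it.
    db.execute("INSERT INTO predictions VALUES (?, ?, ?)",
               (user_id, suggestion, datetime.now(timezone.utc).isoformat()))
    db.commit()
    return suggestion
```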
Neha:
That makes sense! Microservices act as the essential bridge between databases and AI. But how does this hold up in real-time AI applications, like fraud detection or live recommendations? Can microservices help reduce latency?
Bhavesh Chandra:
AI and microservices should always work hand-in-hand. For instance, in fraud detection, when you’re making a payment, you can’t directly expose your payment data to AI. Microservices provide a security layer, ensuring that data privacy, compliance (like HIPAA for healthcare), and security are enforced.
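One way to picture that security layer, as a rough sketch rather than anything from Bhavesh's systems: the microservice tokenizes sensitive payment fields before the transaction ever reaches the fraud-scoring model. All field names and the scoring stub below are hypothetical.

```python
# Illustrative sketch: a microservice redacts sensitive payment fields before the
# transaction reaches the fraud-scoring model. Field names are hypothetical.
import hashlib

SENSITIVE_FIELDS = {"card_number", "cvv", "cardholder_name"}

def tokenize(value: str) -> str:
    """Replace a sensitive value with a stable, non-reversible token."""
    return "tok_" + hashlib.sha256(value.encode()).hexdigest()[:12]

def sanitize_transaction(txn: dict) -> dict:
    """Keep only what the model needs; tokenize anything it must never see raw."""
    return {k: (tokenize(v) if k in SENSITIVE_FIELDS else v) for k, v in txn.items()}

def score_fraud_risk(features: dict) -> float:
    """Stand-in for the real model call; returns a toy risk score."""
    return 0.9 if features.get("amount", 0) > 100_000 else 0.1

txn = {"card_number": "4111111111111111", "cvv": "123",
       "cardholder_name": "A. User", "amount": 250_000, "merchant": "electronics"}
print("fraud risk:", score_fraud_risk(sanitize_transaction(txn)))
```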
Neha:
Exactly. Security is a big concern. So, what about practical applications? Have you worked on any AI-driven systems that are using microservices?
Bhavesh Chandra:
Yes, I’m currently working on an agentic commerce platform. It uses AI to help plan personalized birthday events based on user preferences, even handling tasks like payments and deliveries. This type of application relies heavily on microservices for data security and personalization.
For anyone looking to integrate AI, microservices should be considered a fundamental part of your architecture.
AI Agents and Microservices: Real-Time Applications
Neha:
That sounds like a powerful platform! Can you provide tips on how to integrate AI agents with microservices in real-time applications, such as fraud detection or live recommendations?
Bhavesh Chandra:
One example I’ve worked on is integrating AI agents into existing systems. Take flight booking as an example. Traditionally, you would fill out multiple fields to find flights. But with AI, you can simply say, “I want to travel from Bangalore to Delhi in 2025 for 10 days.” The AI will automatically find the best flights, book them, and even make the payments. This is how AI can simplify complex workflows.
For teams looking to implement this, focus on understanding the data models and how to extract useful data for your AI models. This is where microservices play a key role, by helping to efficiently manage and process data.
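A minimal sketch of that flight-booking idea, assuming the data model is already exposed by an existing microservice: the agent's job is to turn the free-text request into the structured payload the service expects. In production the parsing step would be an LLM with tool or function calling; here a trivial stub stands in for it, and all names are illustrative.

```python
# Sketch of the idea only: map a free-text request onto the structured payload an
# existing flight-search microservice already expects. The parse step would normally
# be an LLM with tool/function calling; a stub stands in for it here.
from dataclasses import dataclass

@dataclass
class FlightSearch:
    origin: str
    destination: str
    year: int
    trip_length_days: int

def parse_request(text: str) -> FlightSearch:
    """Stand-in for the LLM step that maps natural language to the service's schema."""
    return FlightSearch(origin="Bangalore", destination="Delhi", year=2025, trip_length_days=10)

def search_flights(query: FlightSearch) -> list[str]:
    """Stand-in for the existing booking microservice's search endpoint."""
    return [f"{query.origin} -> {query.destination}, {query.trip_length_days}-day round trip"]

request = "I want to travel from Bangalore to Delhi in 2025 for 10 days."
print(search_flights(parse_request(request)))
```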
Kubernetes, Docker & DevOps in the AI Era
Neha:
You’ve also worked with Kubernetes and Docker for scaling AI workloads. How critical are these orchestration tools in managing AI applications?
Bhavesh Chandra:
Orchestration tools like Kubernetes are vital. In the past, if one pod in your microservices cluster went down, you had to manually address it. But with Kubernetes, it automatically handles scaling, failure recovery, and load balancing. For AI, where workloads can be heavy and variable, orchestration tools are indispensable for ensuring uptime and performance.
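For the failure-recovery part specifically, a common pattern (a sketch assuming FastAPI, not taken from Bhavesh's stack) is for the model-serving microservice to expose health endpoints that Kubernetes liveness and readiness probes can poll, so unhealthy pods are restarted and traffic only reaches pods whose model has finished loading.

```python
# Minimal sketch (assuming FastAPI) of the health endpoints a model-serving microservice
# exposes so Kubernetes liveness/readiness probes can restart or route around bad pods.
from fastapi import FastAPI, Response

app = FastAPI()
model = None  # loaded at startup in a real service

@app.on_event("startup")
def load_model() -> None:
    global model
    model = object()  # stand-in for loading real model weights

@app.get("/healthz")
def liveness() -> dict:
    # Liveness: the process answers; if it stops answering, Kubernetes restarts the pod.
    return {"status": "alive"}

@app.get("/readyz")
def readiness(response: Response) -> dict:
    # Readiness: only accept traffic once the model is actually loaded.
    if model is None:
        response.status_code = 503
        return {"status": "loading"}
    return {"status": "ready"}
```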
Neha:
That’s a game-changer! Can you share an example where Kubernetes or Docker made a difference in a real-world application?
Bhavesh Chandra:
In one project, we used Kafka and Spark for data transformation. When managing large clusters, we initially had to monitor each node manually. But with Kubernetes orchestration, the system automatically scaled, preventing downtime and improving overall system efficiency.
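The Kafka-to-Spark transformation pattern he mentions looks roughly like the sketch below, using Spark Structured Streaming and assuming the Spark Kafka connector is available; the broker address, topic name, and output paths are placeholders, not details from the project.

```python
# Sketch (PySpark Structured Streaming) of the Kafka -> Spark transformation pattern.
# Broker address, topic name, and output paths are placeholders, not from the project.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, length

spark = SparkSession.builder.appName("kafka-transform").getOrCreate()

# Read raw events from a Kafka topic as an unbounded stream.
events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "device-events")
          .load())

# Kafka values arrive as bytes; cast to string and apply a simple transformation.
transformed = (events
               .select(col("value").cast("string").alias("payload"))
               .withColumn("payload_length", length(col("payload"))))

# Write the transformed stream out; checkpointing lets the job recover after failures.
query = (transformed.writeStream
         .format("parquet")
         .option("path", "/data/transformed")
         .option("checkpointLocation", "/data/checkpoints")
         .start())
query.awaitTermination()
```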
Generative AI in Production
Neha:
With all the advancements in AI, we are also seeing challenges like AI hallucinations. How do you manage AI’s output when it produces incorrect or biased results?
Bhavesh Chandra:
Hallucinations occur when an AI model hasn’t been trained on the most up-to-date data. For example, if an AI model was last trained in 2018, it wouldn’t know about new libraries introduced since then. The solution is continuous data feeding. Microservices ensure that fresh data is always available to the AI, making it smarter and more accurate over time.
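A minimal sketch of that continuous-data-feeding idea: a microservice fetches fresh records and the prompt grounds the model in them, so answers are not limited to the training cutoff. Both the fetch function and the model call below are stand-ins, not a specific vendor API.

```python
# Sketch of "continuous data feeding": ground the model in fresh records fetched by a
# microservice instead of relying only on its training cutoff. The fetch and LLM calls
# are stand-ins, not a specific vendor API.
from datetime import date

def fetch_recent_records(topic: str) -> list[str]:
    """Stand-in for a microservice call that returns up-to-date domain data."""
    return [f"{date.today()}: latest approved library versions for {topic}"]

def call_llm(prompt: str) -> str:
    """Stand-in for the actual model call."""
    return f"(model answer grounded in: {prompt[:80]}...)"

def answer(question: str, topic: str) -> str:
    context = "\n".join(fetch_recent_records(topic))
    prompt = (
        "Answer using ONLY the context below; say 'not in context' if it is missing.\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    return call_llm(prompt)

print(answer("Which library versions are current?", "python-web-frameworks"))
```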
Neha:
That’s insightful! So, managing hallucinations requires keeping data updated and feeding it continuously to the AI through microservices.
Cloud Strategy: AWS, Azure, or GCP?
Neha:
Now, let’s talk about cloud. When designing an AI-first architecture, how do you decide between AWS, Azure, and GCP? Is it about features, cost, or team familiarity?
Bhavesh Chandra:
It depends on the problem you’re solving. For startups, cost is a priority, so they should choose the most cost-effective cloud provider. Security features are also crucial. As your platform grows, you might migrate to a different provider based on your evolving needs. Choose a cloud provider that aligns with your team’s expertise and your product’s requirements.
Neha:
That was a masterclass, Bhavesh. From microservices to managing AI hallucinations in Generative AI, you’ve shared invaluable lessons for tech leaders and builders. For our listeners, if you found this episode valuable, share it forward, subscribe, and keep innovating. Thank you so much, Bhavesh, for joining us today.
Bhavesh Chandra:
My pleasure, Neha. Thank you!