Build and deploy a LangChain app on Google Cloud Run

Abstract: Unleash your inner AI wizard! Learn to build a fun and interactive LangChain app from scratch. Discover how to harness the power of Large Language Models (LLMs) and the LangChain open-source ecosystem with ease. We’ll guide you through building your app and deploying it on Google Cloud Run. Get ready to create something amazing and share it with the world. Join us for a fun-filled code lab!

For developers curious about the GenAI space, I feel the Gemini SDK provides a friendly environment to start in. Make sure to connect to https://ai.google.dev and Google AI Studio. In contrast with OpenAI, Gemini responses can be grounded with information from Google Search, which helps Gemini connect your users to the latest updates. Google Gemini provides a context window ranging from 100 thousand tokens to 1 million tokens. In contrast, GPT-4 provides a context window of 32 thousand tokens.
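To give a feel for the SDK, here is a minimal sketch of calling Gemini from Python with the google-generativeai package. It assumes `pip install google-generativeai`, a `GOOGLE_API_KEY` environment variable holding a key from Google AI Studio, and a model name that is my assumption rather than anything prescribed by this lab.

```python
# Hedged sketch: one-shot call to Gemini via the google-generativeai SDK.
# Assumes GOOGLE_API_KEY is set; "gemini-1.5-flash" is an assumed model name.
import os

def ask_gemini(prompt: str, model_name: str = "gemini-1.5-flash") -> str:
    # Import lazily so the function only needs the SDK when actually called.
    import google.generativeai as genai
    genai.configure(api_key=os.environ["GOOGLE_API_KEY"])
    model = genai.GenerativeModel(model_name)
    return model.generate_content(prompt).text

if __name__ == "__main__":
    print(ask_gemini("Explain information grounding in one sentence."))
```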

Spock Chat

For me, the LangChain community has provided a robust technical vision of the potential of LLM business applications. The software framework provides accessible tools for JavaScript and Python programmers. In general, the documentation does a nice job of exploring the LLM concepts and patterns for implementation. As an applied researcher, I also appreciate that LangChain connects to all the major LLM platforms. It’s great for tinkering. I have explored creating “retrieval augmented generation” applications connecting to a large ecosystem of vector databases. For personal use, I like using PGVector because I’m a Postgres geek.
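As a taste of what the framework looks like, here is a hedged sketch of a "Spock Chat" style chain using LangChain Expression Language with a Gemini model behind it. It assumes `pip install langchain langchain-google-genai`; the persona prompt and model name are illustrative assumptions, not the lab's exact code.

```python
# Hedged sketch: prompt -> model -> string parser, wired with LangChain
# Expression Language (the | operator). Not the lab's exact implementation.
def build_spock_chain(model_name: str = "gemini-1.5-flash"):
    # Lazy imports so this file loads even before the packages are installed.
    from langchain_core.prompts import ChatPromptTemplate
    from langchain_core.output_parsers import StrOutputParser
    from langchain_google_genai import ChatGoogleGenerativeAI

    prompt = ChatPromptTemplate.from_messages([
        ("system", "You are Spock. Answer with cold Vulcan logic."),
        ("human", "{question}"),
    ])
    llm = ChatGoogleGenerativeAI(model=model_name)
    return prompt | llm | StrOutputParser()

if __name__ == "__main__":
    chain = build_spock_chain()
    print(chain.invoke({"question": "Is tea logical?"}))
```

The same pipe pattern extends to retrieval augmented generation: you insert a retriever step (for example, one backed by PGVector) that fetches context documents before the prompt is filled in.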

I put together this lab to empower learners with concepts around Docker and running applications using Google Cloud Run. I enjoyed getting to share this talk locally at DevFest Florida 2024.

If you’re exploring Google Cloud Platform, I recommend that you connect with the Google Cloud Innovators program. They offer nice learning tools and ways to reduce your personal cloud costs during your learning process. Cloud Run has been awesome for helping me get my Docker workloads to production.
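For context, a Cloud Run service is just a container that listens on the port Cloud Run injects via the `PORT` environment variable. A hypothetical Dockerfile for a small Python web app might look like this; the `main:app` entry point and gunicorn server are assumptions about your app, not requirements of the lab.

```dockerfile
# Hypothetical Dockerfile for a Python LangChain web app on Cloud Run.
FROM python:3.12-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
# Cloud Run sends traffic to $PORT (defaults to 8080).
CMD ["sh", "-c", "gunicorn --bind :${PORT:-8080} main:app"]
```

From the project directory, `gcloud run deploy langchain-app --source . --region us-central1` builds and deploys the container in one step (service name and region here are placeholders).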

You may also find this Google code lab helpful as you explore this topic.
https://codelabs.developers.google.com/codelabs/cloud-run-hello-python3

Lab details and links:
Get Gemini API key here: ai.google.dev/aistudio
Lab details: https://github.com/michaelprosario/langchain-cloudrun-lab

Related Posts

  • Make a Question and Answer Bot with Google Gemini and LangChain
  • librechat.ai: Privacy Focused Client to Your Favorite LLM Tools
  • Google I/O: Practical Stuff for Developers