LangChain & LangGraph — Dev Environment Setup

A complete, step-by-step guide to setting up your local environment for building LLM-powered applications and agent workflows.

01

Python Version & Virtual Environment

Prerequisites

LangChain requires Python 3.9+. Use pyenv or your system package manager to install the correct version, then create an isolated virtual environment.

# Check Python version
python --version          # Python 3.11.x recommended

# Create & activate a virtual environment
python -m venv .venv
source .venv/bin/activate    # macOS / Linux
# .venv\Scripts\activate     # Windows

# Upgrade pip first
pip install --upgrade pip
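To confirm the interpreter and virtual environment are what you expect, here is a quick stdlib-only sanity check (no third-party packages required):

```python
import sys

# LangChain supports Python 3.9+; fail fast on older interpreters.
assert sys.version_info >= (3, 9), f"Python 3.9+ required, found {sys.version}"

# Inside a venv, sys.prefix points at .venv while sys.base_prefix
# points at the system installation; they differ when a venv is active.
in_venv = sys.prefix != sys.base_prefix
print(f"Python {sys.version.split()[0]} / venv active: {in_venv}")
```

If `venv active` prints `False`, re-run the `source .venv/bin/activate` step before installing anything.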
02

Install LangChain Core Packages

Installation

Install the main LangChain packages. langchain provides the high-level API, langchain-core is the base layer, and langchain-community bundles third-party integrations.

pip install langchain langchain-core langchain-community

# Provider-specific packages (install what you need)
pip install langchain-openai      # OpenAI / Azure OpenAI
pip install langchain-anthropic   # Anthropic Claude
pip install langchain-google-genai # Google Gemini
pip install langchain-ollama      # Local models via Ollama
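One way to see which of these packages actually landed in your environment is the stdlib `importlib.metadata` module (the `installed_version` helper name here is our own):

```python
from importlib.metadata import PackageNotFoundError, version
from typing import Optional

def installed_version(package: str) -> Optional[str]:
    """Return a package's installed version string, or None if it is absent."""
    try:
        return version(package)
    except PackageNotFoundError:
        return None

for pkg in ("langchain", "langchain-core", "langchain-community"):
    print(f"{pkg}: {installed_version(pkg) or 'not installed'}")
```

This is also handy for debugging version-mismatch errors between `langchain` and `langchain-core` after an upgrade.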
03

Install LangGraph

Agents & Flows

LangGraph extends LangChain with a graph-based orchestration layer for building stateful, multi-actor agent workflows with cycles, branching, and persistence.

pip install langgraph

# Optional: LangGraph CLI for local dev server
pip install langgraph-cli

# Optional: checkpoint persistence backends
pip install langgraph-checkpoint-sqlite   # SQLite (easy local)
pip install langgraph-checkpoint-postgres # PostgreSQL (prod)
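Conceptually, a checkpoint backend persists graph state keyed by a thread ID so that a run can be resumed later. A toy stdlib sketch of that idea (an illustration only, not the real langgraph-checkpoint API):

```python
import json
import sqlite3
from typing import Optional

class ToyCheckpointer:
    """Illustration of the checkpointing idea: state dicts keyed by thread_id."""

    def __init__(self, path: str = ":memory:"):
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS checkpoints "
            "(thread_id TEXT PRIMARY KEY, state TEXT)"
        )

    def put(self, thread_id: str, state: dict) -> None:
        """Overwrite the saved state for this conversation thread."""
        self.conn.execute(
            "INSERT OR REPLACE INTO checkpoints VALUES (?, ?)",
            (thread_id, json.dumps(state)),
        )
        self.conn.commit()

    def get(self, thread_id: str) -> Optional[dict]:
        """Load the last saved state for this thread, if any."""
        row = self.conn.execute(
            "SELECT state FROM checkpoints WHERE thread_id = ?", (thread_id,)
        ).fetchone()
        return json.loads(row[0]) if row else None
```

The real backends do the same thing with richer versioning; you attach one via `graph.compile(checkpointer=...)` and select a thread with a `thread_id` in the invoke config.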
04

Configure API Keys

Secrets

Never hard-code keys in source files. Use a .env file loaded with python-dotenv, or export them as shell environment variables.

# .env  (add to .gitignore!)
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
LANGCHAIN_API_KEY=ls__...      # LangSmith tracing
LANGCHAIN_TRACING_V2=true
LANGCHAIN_PROJECT=my-project

# Install the loader
pip install python-dotenv

# At the top of every script / notebook
from dotenv import load_dotenv
load_dotenv()   # reads .env into os.environ
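After load_dotenv(), a fail-fast guard catches a missing key up front instead of deep inside a provider traceback (`require_env` is our own helper name, not a library function):

```python
import os

def require_env(name: str) -> str:
    """Return an environment variable's value or raise a clear error."""
    value = os.getenv(name)
    if not value:
        raise RuntimeError(f"{name} is not set; check your .env file")
    return value

# Example: fail immediately if the OpenAI key never loaded.
# openai_key = require_env("OPENAI_API_KEY")
```

A one-line guard like this at the top of a script turns a cryptic 401 into an actionable error message.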
05

Verify the Installation

Smoke Test

Run this quick sanity check to confirm all packages are importable and the LLM responds correctly.

from dotenv import load_dotenv
load_dotenv()

from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage

llm = ChatOpenAI(model="gpt-4o-mini")
response = llm.invoke([HumanMessage(content="Hello, LangChain!")])
print(response.content)
💡 Expected output: a short greeting from the model. If you see an AuthenticationError, double-check that your .env file is in the project root and load_dotenv() runs before the model call.
06

Your First LangGraph Agent

Hello Graph

A minimal LangGraph workflow: a single-node graph that routes a user message through an LLM and returns the response.

from typing import TypedDict, Annotated
from langgraph.graph import StateGraph, END
from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage, BaseMessage
import operator

class State(TypedDict):
    messages: Annotated[list[BaseMessage], operator.add]

llm = ChatOpenAI(model="gpt-4o-mini")

def chat_node(state: State) -> State:
    response = llm.invoke(state["messages"])
    return {"messages": [response]}

graph = StateGraph(State)
graph.add_node("chat", chat_node)
graph.set_entry_point("chat")
graph.add_edge("chat", END)
app = graph.compile()

result = app.invoke({"messages": [HumanMessage("What is LangGraph?")]})
print(result["messages"][-1].content)
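The Annotated[..., operator.add] reducer in the State above is what lets chat_node return only its new messages: LangGraph merges each node's partial update into the existing state by concatenation. A plain-list example (strings stand in for message objects) makes the mechanic concrete:

```python
import operator

# Existing state channel and a node's partial update.
history = ["What is LangGraph?"]
update = ["LangGraph is a graph-based agent framework."]

# LangGraph applies the declared reducer to combine them,
# so the update is appended rather than replacing the history.
merged = operator.add(history, update)
print(merged)
```

Without the reducer annotation, a returned `messages` list would overwrite the channel instead of extending it.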
07

Recommended Tools & Next Steps

Ecosystem

Your environment is ready. Here’s what to explore next:

  • LangSmith — observability, tracing & prompt management at smith.langchain.com
  • LangGraph Studio — visual graph editor & local dev server (langgraph dev)
  • LangChain Hub — community prompt library at hub.langchain.com
  • LCEL — LangChain Expression Language for composing chains with | pipes
  • Vector Stores — add langchain-chroma, langchain-pinecone, or faiss-cpu for RAG
  • requirements.txt — freeze deps with pip freeze > requirements.txt for reproducibility
📚 Full documentation lives at python.langchain.com and langchain-ai.github.io/langgraph. Both sites include interactive notebooks you can run in Google Colab.
LangChain & LangGraph setup guide · built with ✦ care
