Part 3: First AI Integration
Recap: Parts 1-2 established our foundation and created a basic Node.js project structure.
Today we make the exciting leap to an AI-powered application by connecting to Azure OpenAI Service.
Azure OpenAI Setup
Request access at portal.azure.com (approval typically takes 1-2 days). Once approved, create an Azure OpenAI resource, deploy a chat model (for example, gpt-35-turbo), and note your endpoint, API key, and deployment name.
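If you prefer the Azure CLI to the portal, the commands below are a rough sketch of the same steps. The resource name, resource group, region, deployment name, and model version are placeholders; adjust them for your subscription.

# Create the Azure OpenAI resource
az cognitiveservices account create \
  --name my-openai-resource --resource-group my-rg \
  --kind OpenAI --sku S0 --location eastus

# Deploy a chat model to the resource
az cognitiveservices account deployment create \
  --name my-openai-resource --resource-group my-rg \
  --deployment-name my-deployment \
  --model-format OpenAI --model-name gpt-35-turbo --model-version "0613" \
  --sku-name Standard --sku-capacity 1

# Retrieve the API key
az cognitiveservices account keys list \
  --name my-openai-resource --resource-group my-rg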
Add AI Dependencies
npm install @azure/openai
Update .env:
AZURE_OPENAI_API_KEY=your_key
AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com
AZURE_OPENAI_DEPLOYMENT=your_deployment
Create AI Service
Create src/services/aiService.js:
const { OpenAIClient, AzureKeyCredential } = require('@azure/openai');

// The Azure client takes the resource endpoint and key directly;
// the deployment name is passed with each request.
const client = new OpenAIClient(
  process.env.AZURE_OPENAI_ENDPOINT,
  new AzureKeyCredential(process.env.AZURE_OPENAI_API_KEY)
);

async function getAIResponse(message) {
  const result = await client.getChatCompletions(
    process.env.AZURE_OPENAI_DEPLOYMENT,
    [
      { role: 'system', content: 'You are a helpful customer support assistant.' },
      { role: 'user', content: message }
    ]
  );
  return result.choices[0].message.content;
}

module.exports = { getAIResponse };
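Before wiring the service into Express, you can sanity-check it on its own. This is a minimal sketch; the scripts/test-ai.js path and the sample question are placeholders.

// scripts/test-ai.js: quick standalone check of the AI service
require('dotenv').config();
const { getAIResponse } = require('../src/services/aiService');

getAIResponse('What are your support hours?')
  .then((reply) => console.log('AI reply:', reply))
  .catch((err) => console.error('Request failed:', err.message));

Run it from the project root with node scripts/test-ai.js.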
Add Chat Endpoint
Update src/app.js:
require('dotenv').config();
const express = require('express');
const { getAIResponse } = require('./services/aiService');

const app = express();
app.use(express.json());

app.post('/chat', async (req, res) => {
  const { message } = req.body || {};
  if (!message) return res.status(400).json({ error: 'message is required' });
  try {
    const aiResponse = await getAIResponse(message);
    res.json({ response: aiResponse });
  } catch (err) {
    console.error('AI request failed:', err);
    res.status(500).json({ error: 'Failed to get AI response' });
  }
});

app.listen(3000, () => console.log('AI server running on port 3000'));
Test It
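Start the server first. The command below assumes src/app.js is the entry point; if Part 2 set up an npm start script in package.json, use that instead.

node src/app.js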
curl -X POST http://localhost:3000/chat \
-H "Content-Type: application/json" \
-d '{"message": "Hi, I need help"}'
You now have an AI-powered application! In Part 4, we’ll add LangChain for conversation memory.