
    Working with Files and Environment Variables

    Two of the most common things you'll do in Node.js are reading/writing files and managing secrets like API keys. These come up constantly in AI applications — reading data files, saving outputs, and securely storing your AI API keys.

    Reading and Writing Files

    Node.js has a built-in fs (file system) module. There are three styles you'll see:

    Style 1: Synchronous (Blocking)

    The simplest approach — your code waits for the file operation to complete:

    import fs from "fs";
    
    // Read a file
    const content = fs.readFileSync("data.txt", "utf-8");
    console.log(content);
    
    // Write a file
    fs.writeFileSync("output.txt", "Hello, World!");
    
    // Append to a file
    fs.appendFileSync("log.txt", "New log entry\n");
    
    // Check if a file exists
    const exists = fs.existsSync("data.txt");
    console.log("File exists:", exists);

    Style 2: Promises (Non-Blocking, Modern)

    Better for production code — doesn't block other operations:

    import fs from "fs/promises";
    
    async function processFile() {
      // Read
      const content = await fs.readFile("data.json", "utf-8");
      const data = JSON.parse(content);
    
      // Transform
      data.processedAt = new Date().toISOString();
    
      // Write
      await fs.writeFile("output.json", JSON.stringify(data, null, 2));
      console.log("File processed successfully!");
    }
    
    processFile();

    Style 3: Callbacks (Old Way)

    You'll see this in older code. It works but is harder to read:

    import fs from "fs";
    
    fs.readFile("data.txt", "utf-8", (err, content) => {
      if (err) {
        console.error("Error reading file:", err);
        return;
      }
      console.log(content);
    });

    Which Style Should You Use?

    Style            When to Use
    readFileSync     Quick scripts, configuration loading at startup
    fs/promises      Production code, async operations, inside async functions
    Callbacks        Only when working with legacy code

    Common File Operations

    import fs from "fs/promises";
    
    // Read and parse JSON
    const rawData = await fs.readFile("users.json", "utf-8");
    const users = JSON.parse(rawData);
    
    // Write JSON (pretty-printed with 2-space indent)
    await fs.writeFile("output.json", JSON.stringify(users, null, 2));
    
    // Create a directory
    await fs.mkdir("output", { recursive: true });
    
    // List files in a directory
    const files = await fs.readdir("./data");
    console.log("Files:", files);
    
    // Delete a file
    await fs.unlink("temp.txt");
    
    // Copy a file
    await fs.copyFile("source.txt", "backup.txt");

    What to ask your AI: "Write a Node.js script that reads all JSON files from a directory, combines them, and writes the result to a single output file."
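    That prompt might produce something like the following sketch. The function name and the assumption that each file holds a JSON array are ours, not part of any standard API:

```javascript
// Sketch: merge every JSON array file in a directory into one output file.
// Assumes each .json file contains an array; combineJsonFiles is our own name.
import fs from "fs/promises";
import path from "path";

async function combineJsonFiles(inputDir, outputFile) {
  const files = await fs.readdir(inputDir);
  const combined = [];
  for (const file of files) {
    if (path.extname(file) !== ".json") continue; // skip non-JSON files
    const raw = await fs.readFile(path.join(inputDir, file), "utf-8");
    combined.push(...JSON.parse(raw)); // flatten each file's array into one
  }
  await fs.writeFile(outputFile, JSON.stringify(combined, null, 2));
  return combined.length;
}
```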

    The path Module

    When working with files, you need to handle file paths correctly. The built-in path module helps with this:

    import path from "path";
    
    // Join path segments (handles slashes correctly on any OS)
    const filePath = path.join("data", "users", "profile.json");
    // Result: "data/users/profile.json" (Mac/Linux)
    // Result: "data\\users\\profile.json" (Windows)
    
    // Get the file name from a path
    path.basename("/home/user/document.pdf");  // "document.pdf"
    
    // Get the directory name
    path.dirname("/home/user/document.pdf");   // "/home/user"
    
    // Get the file extension
    path.extname("photo.jpg");                 // ".jpg"
    
    // Create an absolute path from a relative one
    const absolute = path.resolve("data", "file.txt");
    // Result: "/Users/you/project/data/file.txt"

    A very common pattern you'll see in AI-generated code:

    import path from "path";
    import { fileURLToPath } from "url";
    
    // Get the current file's directory (ES Modules)
    const __filename = fileURLToPath(import.meta.url);
    const __dirname = path.dirname(__filename);
    
    // Now you can build paths relative to this file
    const dataPath = path.join(__dirname, "..", "data", "config.json");

    What to ask your AI: "I need to read a file that's in a subfolder relative to my script. How do I build the correct path?"

    Environment Variables

    Environment variables are values stored outside your code that your application reads at runtime. They're used for:

    • API keys (OpenAI, Anthropic, Firebase)
    • Database connection strings
    • Server configuration (port, host)
    • Feature flags

    Reading Environment Variables

    // Access environment variables through process.env
    const apiKey = process.env.OPENAI_API_KEY;
    const port = process.env.PORT || "3000";
    const nodeEnv = process.env.NODE_ENV || "development";
    
    console.log("Running on port:", port);
    console.log("Environment:", nodeEnv);
    
    // Always validate required variables
    if (!apiKey) {
      throw new Error("OPENAI_API_KEY environment variable is required!");
    }
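    You can wrap that validation in a small helper so every required variable gets the same treatment. The helper name here is ours, not a Node API:

```javascript
// Illustrative helper: read a required environment variable or fail fast
// with a clear error message instead of failing mysteriously later.
function requireEnv(name) {
  const value = process.env[name];
  if (!value) {
    throw new Error(`${name} environment variable is required!`);
  }
  return value;
}

// Usage:
// const apiKey = requireEnv("OPENAI_API_KEY");
```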

    Setting Environment Variables

    You can set them temporarily when running a command:

    # Set a variable for one command
    PORT=4000 node server.js
    
    # Set multiple variables
    PORT=4000 NODE_ENV=production node server.js

    But this gets tedious. That's where .env files come in.

    .env Files and dotenv

    A .env file stores your environment variables in a simple key=value format:

    # .env
    OPENAI_API_KEY=sk-abc123def456...
    ANTHROPIC_API_KEY=sk-ant-abc123...
    DATABASE_URL=postgresql://user:pass@localhost:5432/mydb
    PORT=3000
    NODE_ENV=development

    To load these variables into your application, use the dotenv package:

    npm install dotenv

    // Load .env variables at the very top of your entry file
    import dotenv from "dotenv";
    dotenv.config();
    
    // Now process.env has all your .env variables
    const apiKey = process.env.OPENAI_API_KEY;
    console.log("API key loaded:", apiKey ? "Yes" : "No");

    Critical Security Rules

    Rule                       Why
    Never commit .env to Git   Your API keys would be visible to anyone
    Add .env to .gitignore     Prevents accidental commits
    Create a .env.example      Shows what variables are needed (without real values)
    Never hardcode secrets     Don't put API keys directly in your code
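    The .gitignore rule is a single line in practice (the node_modules entry is its usual companion):

```
# .gitignore
.env
node_modules/
```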

    A typical .env.example file:

    # .env.example (commit this to Git)
    OPENAI_API_KEY=your-api-key-here
    DATABASE_URL=your-database-url-here
    PORT=3000

    Why Environment Variables Matter for AI Apps

    Almost every AI application needs secret keys:

    import dotenv from "dotenv";
    dotenv.config();
    
    import { OpenAI } from "openai";
    
    // The API key comes from the environment — never hardcoded
    const openai = new OpenAI({
      apiKey: process.env.OPENAI_API_KEY,
    });
    
    async function chat(message) {
      const response = await openai.chat.completions.create({
        model: "gpt-4",
        messages: [{ role: "user", content: message }],
      });
      return response.choices[0].message.content;
    }

    What to ask your AI: "Set up environment variable handling for my project. I need variables for [list your API keys and config]."

    Command Line Arguments with process.argv

    You can pass arguments to your Node.js scripts:

    node script.js hello world --verbose

    // process.argv is an array of all arguments
    console.log(process.argv);
    // [
    //   "/usr/local/bin/node",   // Path to Node.js
    //   "/path/to/script.js",    // Path to your script
    //   "hello",                 // Your first argument
    //   "world",                 // Your second argument
    //   "--verbose"              // Your third argument
    // ]
    
    // Skip the first two (node path and script path)
    const args = process.argv.slice(2);
    console.log(args);  // ["hello", "world", "--verbose"]
    
    // Practical example: a script that processes a specific file
    import fs from "fs";

    const inputFile = process.argv[2];
    if (!inputFile) {
      console.error("Usage: node script.js <input-file>");
      process.exit(1);
    }
    
    const content = fs.readFileSync(inputFile, "utf-8");
    console.log(`Processing ${inputFile}...`);
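    For anything beyond a couple of positional arguments, hand-parsing flags gets messy. Recent Node versions (18.3+) ship util.parseArgs, which handles flags like --verbose for you. A small sketch (the args array here is hardcoded for illustration; by default it reads process.argv):

```javascript
// Parse flags and positionals with Node's built-in util.parseArgs (Node 18.3+).
import { parseArgs } from "node:util";

const { values, positionals } = parseArgs({
  args: ["hello", "world", "--verbose"], // defaults to process.argv.slice(2)
  options: {
    verbose: { type: "boolean" },
  },
  allowPositionals: true,
});

console.log(positionals); // ["hello", "world"]
console.log(values.verbose); // true
```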

    Putting It Together: A Practical Script

    Here's a script that combines everything — reading files, using environment variables, and processing data:

    import dotenv from "dotenv";
    dotenv.config();
    
    import fs from "fs/promises";
    import path from "path";
    
    async function main() {
      // Get the input file from command line args
      const inputFile = process.argv[2] || "data/input.json";
    
      // Read the input
      const raw = await fs.readFile(inputFile, "utf-8");
      const data = JSON.parse(raw);
    
      console.log(`Loaded ${data.length} items from ${inputFile}`);
    
      // Process and write output
      const outputDir = path.join(process.cwd(), "output");
      await fs.mkdir(outputDir, { recursive: true });
    
      const outputFile = path.join(outputDir, "result.json");
      await fs.writeFile(outputFile, JSON.stringify(data, null, 2));
    
      console.log(`Results saved to ${outputFile}`);
    }
    
    main().catch(console.error);

    What's Next?

    You can now work with files and environment variables — essential skills for any Node.js application. The next tutorial covers HTTP basics and Express — building web servers and APIs.

    What to ask your AI: "Create a Node.js script that reads a CSV file, transforms the data, and writes the result as JSON. Use proper error handling."
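    A minimal version of the transformation that prompt describes might look like this. It uses naive comma splitting, so it ignores quoted fields and embedded commas, and the function name is ours:

```javascript
// Sketch: turn simple CSV text into an array of objects keyed by the header row.
// Naive comma splitting: does not handle quoted fields or embedded commas.
function csvToJson(csvText) {
  const lines = csvText.trim().split("\n");
  const headers = lines[0].split(",").map((h) => h.trim());
  return lines.slice(1).map((line) => {
    const cells = line.split(",").map((c) => c.trim());
    const row = {};
    headers.forEach((header, i) => {
      row[header] = cells[i];
    });
    return row;
  });
}
```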

