Skander Ben Ali

How I Save 10 Hours a Week with Node.js Automation Scripts

It hit me one evening as I was manually resizing the 47th product image of the day for a client's Shopify store. I had spent nearly two hours on a task that should have taken minutes. As a freelance developer juggling multiple clients, these "quick" repetitive tasks were silently eating away at my productivity.

The tipping point came when I calculated how much time I was spending on these mundane activities: pulling weekly analytics reports, reformatting spreadsheets, resizing images, updating product listings... it added up to almost 10 hours every week. That's essentially a full workday!

So I did what any self-respecting developer would do – I automated the hell out of it with Node.js.

In this post, I'll walk you through three real-world automation scripts I built that transformed my workflow and freed up a significant chunk of my week. These aren't theoretical examples – they're actual solutions I use with clients today.

Use Case #1: Auto-Generating Client Reports

The Problem

One of my clients, a content marketing agency, needed weekly activity reports pulled from their CMS API. The process was painful: logging into their system, exporting data, cleaning it up in Excel, formatting it nicely, and emailing it over. The whole ordeal took about 45 minutes each week.

The Solution

I wrote a Node.js script that handles the entire workflow. It fetches the data, formats it into a clean report, and emails it automatically every Monday morning. I set it up with a cron job on a $5/month VPS, and I haven't had to think about it since.

Here's the core of the script:

const axios = require('axios');
const nodemailer = require('nodemailer');
const fs = require('fs');
const moment = require('moment');

async function generateReport() {
  // Get date range for the past week
  const endDate = moment().format('YYYY-MM-DD');
  const startDate = moment().subtract(7, 'days').format('YYYY-MM-DD');

  console.log(`Generating report for ${startDate} to ${endDate}...`);

  try {
    // Fetch data with proper authentication
    const { data } = await axios.get('https://client-cms.com/api/posts', {
      headers: { 'Authorization': `Bearer ${process.env.API_KEY}` },
      params: { start: startDate, end: endDate }
    });

    // Process the data into a readable format
    const reportData = {
      totalPosts: data.length,
      viewCount: data.reduce((sum, post) => sum + post.views, 0),
      topPerforming: [...data].sort((a, b) => b.views - a.views).slice(0, 5),
      dateRange: { startDate, endDate }
    };

    // Write formatted report (creating the reports directory if needed)
    fs.mkdirSync('./reports', { recursive: true });
    const reportPath = `./reports/weekly-report-${endDate}.json`;
    fs.writeFileSync(reportPath, JSON.stringify(reportData, null, 2));

    return reportPath;
  } catch (error) {
    console.error('Error generating report:', error.message);
    throw error;
  }
}

async function sendEmail(reportPath) {
  // Create reusable transporter using environment variables for security
  let transporter = nodemailer.createTransport({
    service: 'gmail',
    auth: {
      user: process.env.EMAIL_USER,
      pass: process.env.EMAIL_PASS,
    },
  });

  // Format date for email subject
  const reportDate = moment().format('MMMM D, YYYY');

  // Send mail with defined transport object
  await transporter.sendMail({
    from: '"Weekly Report Bot" <[email protected]>',
    to: '[email protected], [email protected]',
    subject: `Content Performance Report - Week of ${reportDate}`,
    text: `Please find attached the weekly content performance report for ${reportDate}.`,
    attachments: [{ filename: `weekly-report-${moment().format('YYYY-MM-DD')}.json`, path: reportPath }],
  });

  console.log('Report sent successfully!');
}

// Run the workflow
generateReport()
  .then(sendEmail)
  .catch(err => console.error('Workflow failed:', err));

The client was thrilled – not only do they get their reports consistently now, but they arrive first thing Monday morning instead of whenever I got around to it.
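For the curious, the scheduling side is a single crontab entry on the VPS. Something like this, where the install path and log file name are illustrative:

```shell
# m h dom mon dow  command
# Run the report workflow every Monday at 7:00 AM server time,
# appending output to a log file for later troubleshooting.
0 7 * * 1 cd /home/deploy/weekly-report && /usr/bin/node index.js >> cron.log 2>&1
```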

Use Case #2: Auto-Resizing Images for E-commerce

The Problem

An e-commerce client had strict requirements for product images on their Shopify store. Every image needed to be exactly 1200x1200px with a white background. They were spending hours each week manually editing photos in Photoshop.

The Solution

I built a simple watch folder system. Now they just drop raw product photos into a designated folder, and my script automatically processes them:

const chokidar = require('chokidar');
const sharp = require('sharp');
const path = require('path');
const fs = require('fs');

// Create output directory if it doesn't exist
const outputDir = './processed-images';
if (!fs.existsSync(outputDir)) {
  fs.mkdirSync(outputDir);
}

// Set up file watcher
const watcher = chokidar.watch('./incoming-images', {
  ignored: /(^|[\/\\])\../, // ignore dotfiles
  persistent: true,
  awaitWriteFinish: true // wait until a file finishes copying before firing 'add'
});

// Process function to resize and center image on white background
async function processImage(filePath) {
  const fileName = path.basename(filePath);
  const outputPath = path.join(outputDir, fileName);

  try {

    // Create 1200x1200 white canvas
    const canvas = sharp({
      create: {
        width: 1200,
        height: 1200,
        channels: 4,
        background: { r: 255, g: 255, b: 255, alpha: 1 }
      }
    });

    // Resize original image to fit within 1100x1100 (leaving some margins)
    const resized = await sharp(filePath)
      .resize(1100, 1100, { 
        fit: 'contain',
        background: { r: 0, g: 0, b: 0, alpha: 0 }
      })
      .toBuffer();

    // Composite resized image onto white canvas (centered)
    await canvas.composite([{ 
      input: resized,
      gravity: 'center'
    }])
    .toFile(outputPath);

    console.log(`✅ Processed: ${fileName}`);
  } catch (err) {
    console.error(`❌ Error processing ${fileName}:`, err);
  }
}

// Set up watcher events
watcher
  .on('add', filePath => {
    console.log(`File added: ${filePath}`);
    processImage(filePath);
  })
  .on('error', error => console.error(`Watcher error: ${error}`));

console.log('Image processor running! Drop files into ./incoming-images');

This script saves them about 3 hours each week. The best part is that it runs locally on their computer, so they don't need to worry about uploading images to a server.
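One caveat worth noting: if someone drops a whole folder of photos in at once, the watcher fires `processImage` for every file in parallel. A small concurrency limiter keeps that in check. This is a sketch, and `createLimiter` is my own helper name, not part of chokidar or sharp:

```javascript
// Minimal concurrency limiter: at most `limit` tasks run at once,
// the rest wait in a queue until a running task finishes.
function createLimiter(limit) {
  let active = 0;
  const queue = [];

  const next = () => {
    if (active >= limit || queue.length === 0) return;
    active += 1;
    const { task, resolve, reject } = queue.shift();
    task().then(resolve, reject).finally(() => {
      active -= 1;
      next(); // start the next queued task, if any
    });
  };

  // Returns a wrapper you call instead of the task directly.
  return task => new Promise((resolve, reject) => {
    queue.push({ task, resolve, reject });
    next();
  });
}

// Usage with the watcher above (illustrative):
// const limit = createLimiter(3);
// watcher.on('add', filePath => limit(() => processImage(filePath)));
```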

Use Case #3: Batch Uploading Product Data

The Problem

My client's marketing team regularly updates product pricing and descriptions. They maintained this data in spreadsheets but had to manually copy-paste each update into their e-commerce admin panel. With hundreds of products, this was a multi-hour task prone to errors.

The Solution

I wrote a script that reads their CSV file and automatically updates their product database via the API:

const csv = require('csvtojson');
const axios = require('axios');
const fs = require('fs');
const path = require('path');

// Configuration
const API_ENDPOINT = 'https://api.client-store.com/products';
const API_KEY = process.env.PRODUCT_API_KEY;
const csvFilePath = process.argv[2] || './product-updates.csv';

if (!fs.existsSync(csvFilePath)) {
  console.error(`Error: File not found at ${csvFilePath}`);
  process.exit(1);
}

// Setup API client
const api = axios.create({
  baseURL: API_ENDPOINT,
  headers: {
    'Authorization': `Bearer ${API_KEY}`,
    'Content-Type': 'application/json'
  }
});

async function updateProducts() {
  // Keep track of success/failure
  const results = {
    success: 0,
    failed: 0,
    errors: []
  };

  try {
    // Parse CSV to JSON
    const products = await csv().fromFile(csvFilePath);
    console.log(`Found ${products.length} products to update`);

    // Process each product
    for (let [index, product] of products.entries()) {
      try {
        // Simple validation
        if (!product.id || !product.sku) {
          throw new Error('Missing required product identifier (id or sku)');
        }

        console.log(`[${index + 1}/${products.length}] Updating product ${product.sku}...`);

        // Send update to API
        await api.put(`/${product.id}`, product);
        results.success++;

      } catch (err) {
        results.failed++;
        results.errors.push(`Product ${product.sku || index}: ${err.message}`);
        console.error(`  Failed: ${err.message}`);
      }

      // Small delay to avoid rate limiting
      await new Promise(resolve => setTimeout(resolve, 100));
    }

    // Log results
    console.log('\n--- Update Complete ---');
    console.log(`✅ Successfully updated: ${results.success} products`);
    console.log(`❌ Failed updates: ${results.failed} products`);

    if (results.errors.length > 0) {
      fs.writeFileSync(
        './update-errors.log', 
        results.errors.join('\n'), 
        'utf8'
      );
      console.log('See update-errors.log for details on failed updates');
    }

  } catch (err) {
    console.error('Fatal error:', err);
  }
}

updateProducts();

The marketing team now runs this weekly with their updated spreadsheets, saving around 2-3 hours each time.
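The fixed 100 ms pause works, but if the API starts returning rate-limit errors mid-run, those updates simply fail. A retry wrapper with exponential backoff is more forgiving. This is a sketch; `withRetry` and its options are illustrative names, not something the script above already has:

```javascript
// Retry an async operation with exponential backoff:
// wait baseDelayMs, then 2x, then 4x, ... before giving up.
async function withRetry(fn, { retries = 3, baseDelayMs = 500 } = {}) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt >= retries) throw err; // out of attempts, surface the error
      const delay = baseDelayMs * 2 ** attempt;
      console.log(`Attempt ${attempt + 1} failed (${err.message}), retrying in ${delay}ms`);
      await new Promise(resolve => setTimeout(resolve, delay));
    }
  }
}

// Usage inside the update loop (hypothetical, reusing the `api` client):
// await withRetry(() => api.put(`/${product.id}`, product));
```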

The Real Impact of Automation

When I tallied up all the time these scripts (and a handful of others) have saved me and my clients, it comes to roughly 10 hours every week. But the benefits go beyond just time savings:

  1. Fewer Errors: Humans make mistakes when doing repetitive tasks. My scripts don't (well, unless I write buggy code).

  2. Consistent Quality: Every report follows the same format, every image is processed exactly the same way.

  3. Better Client Relationships: Tasks get done on schedule, even when I'm busy or on vacation.

  4. More Brain Space: I'm no longer mentally drained from tedious work, leaving more creative energy for solving interesting problems.

  5. Better Profit Margins: I can serve more clients without working more hours.

Lessons I've Learned About Automation

After building dozens of these little automation scripts, I've developed some personal rules:

1. The Rule of Three

If I have to do a task more than twice, I'll automate it the third time. The first two instances help me understand the nuances and edge cases before I codify the process.

2. Log Everything

Automation is only helpful if you can trust it. I add extensive logging to every script so I can verify things are working correctly and quickly troubleshoot when they aren't.

// Bad
await processFile(filePath);

// Good
console.log(`Starting to process: ${filePath}`);
try {
  await processFile(filePath);
  console.log(`Successfully processed: ${filePath}`);
} catch (err) {
  console.error(`Failed to process ${filePath}: ${err.message}`);
}

3. Never Hardcode Secrets

I've seen too many GitHub repos with API keys committed to source control. I always use environment variables or config files for any sensitive information.

4. Make It Reusable

I build scripts with configurability in mind so I can reuse them across different clients with minimal changes.

Ready to Automate Your Own Tasks?

If you're feeling inspired to create your own time-saving scripts, I've published starter templates for the examples above on GitHub: github.com/skanderbenali/node.js-automation-tools

Some useful packages to get you started:

  • node-cron: For scheduling recurring tasks
  • sharp: The Swiss Army knife of image processing
  • axios: For making HTTP requests
  • csvtojson: For working with CSV files
  • chokidar: For file watching
  • nodemailer: For sending emails

What Could You Automate?

I'm curious – what repetitive tasks are eating up your time? Drop a comment below and I might feature your workflow in my next automation article.

Just remember, the goal isn't to automate everything. It's to automate the tedious stuff so you can focus on the work that actually requires your creativity and expertise.

Happy automating!
