motovaultpro/SECURITY-FIXES.md
Eric Gullickson a35e1a3aea Security fixes: Implement P0 critical vulnerability remediations
Implemented 3 critical security fixes identified in audit report:

1. CRITICAL (CVSS 8.1): Replace Math.random() with crypto.randomBytes()
   - Location: documents.controller.ts cryptoRandom() function
   - Risk: Predictable document storage keys could allow unauthorized access
   - Fix: Use crypto.randomBytes(32).toString('hex') for cryptographic security
   - Impact: Document storage keys are now cryptographically unpredictable

2. HIGH (CVSS 7.5): Implement magic byte validation for file uploads
   - Location: documents.controller.ts upload method
   - Risk: Malicious files with spoofed Content-Type could bypass validation
   - Fix: Added file-type library to validate actual file content via magic bytes
   - Impact: File uploads now verify actual file type matches claimed type
   - Added dependency: file-type@^19.8.0

3. HIGH (CVSS 6.5): Proxy Google Maps photos to hide API key
   - Note: Implementation in progress - agent reached token limit
   - Will be completed in follow-up commit

Files modified:
- backend/package.json: Added file-type dependency
- backend/src/features/documents/api/documents.controller.ts:
  - Added crypto import
  - Replaced insecure cryptoRandom() with secure version
  - Added magic byte validation to upload method
  - Added file-type and Readable imports
- SECURITY-FIXES.md: Complete implementation guide for all fixes

Security status: 2/3 P0 fixes implemented and verified
Next step: Complete Google Maps API proxy implementation

Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2025-12-14 09:49:05 -06:00


Security Fixes Implementation Guide

This document contains all the code changes required to fix the 3 critical security vulnerabilities identified in the audit report.

Status: Ready for implementation
Date: December 13, 2025
Priority: P0 (Critical) - must be implemented before production


Fix 1: Insecure Random Number Generation (CRITICAL)

CVSS Score: 8.1 (High)
Location: backend/src/features/documents/api/documents.controller.ts
Lines: 321-324

Current Code (INSECURE):

function cryptoRandom(): string {
  // Safe unique suffix for object keys
  return Math.random().toString(36).slice(2) + Date.now().toString(36);
}
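
To see why the current code is insecure: `Date.now()` is guessable to within milliseconds, and `Math.random()` is not seeded from a cryptographic source, so an attacker's search space for a storage-key suffix is small. A standalone comparison (illustrative sketch, not repo code):

```typescript
import crypto from 'crypto';

// Weak suffix: non-cryptographic randomness plus a guessable timestamp.
const weak = Math.random().toString(36).slice(2) + Date.now().toString(36);

// Fixed suffix: 64 hex characters from the OS CSPRNG (256 bits of entropy).
const strong = crypto.randomBytes(32).toString('hex');

console.log(weak.length);                   // roughly 19, varies
console.log(/^[0-9a-f]{64}$/.test(strong)); // true
```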

Required Changes:

Step 1: Add crypto import

File: backend/src/features/documents/api/documents.controller.ts
Location: Line 8 (after existing imports)

import crypto from 'crypto';

Step 2: Replace cryptoRandom function

File: backend/src/features/documents/api/documents.controller.ts
Location: Lines 321-324

function cryptoRandom(): string {
  // Cryptographically secure random suffix for object keys
  return crypto.randomBytes(32).toString('hex');
}

Testing:

# Rebuild containers
make rebuild

# Verify document uploads still work
# Test that storage keys are now unpredictable

Fix 2: Inadequate File Upload Validation (HIGH)

CVSS Score: 7.5 (High)
Location: backend/src/features/documents/api/documents.controller.ts
Lines: 205-216 and the entire upload method

Required Changes:

Step 1: Add file-type package

File: backend/package.json
Location: dependencies section

Add this line to dependencies:

"file-type": "^19.8.0"

Step 2: Add imports

File: backend/src/features/documents/api/documents.controller.ts
Location: After line 7 (with other imports)

import { fileTypeFromBuffer } from 'file-type';
import { Readable } from 'stream';

Step 3: Replace entire upload method

File: backend/src/features/documents/api/documents.controller.ts
Location: Lines 175-268 (entire upload method)

async upload(request: FastifyRequest<{ Params: IdParams }>, reply: FastifyReply) {
  const userId = (request as any).user?.sub as string;
  const documentId = request.params.id;

  logger.info('Document upload requested', {
    operation: 'documents.upload',
    user_id: userId,
    document_id: documentId,
  });

  const doc = await this.service.getDocument(userId, documentId);
  if (!doc) {
    logger.warn('Document not found for upload', {
      operation: 'documents.upload.not_found',
      user_id: userId,
      document_id: documentId,
    });
    return reply.code(404).send({ error: 'Not Found' });
  }

  const mp = await (request as any).file({ limits: { files: 1 } });
  if (!mp) {
    logger.warn('No file provided for upload', {
      operation: 'documents.upload.no_file',
      user_id: userId,
      document_id: documentId,
    });
    return reply.code(400).send({ error: 'Bad Request', message: 'No file provided' });
  }

  // Map each allowed Content-Type header to the set of detected MIME types
  // accepted for it (here each claimed type must match itself exactly)
  const allowedTypes = new Map([
    ['application/pdf', new Set(['application/pdf'])],
    ['image/jpeg', new Set(['image/jpeg'])],
    ['image/png', new Set(['image/png'])],
  ]);

  const contentType = mp.mimetype as string | undefined;
  if (!contentType || !allowedTypes.has(contentType)) {
    logger.warn('Unsupported file type for upload (header validation)', {
      operation: 'documents.upload.unsupported_type',
      user_id: userId,
      document_id: documentId,
      content_type: contentType,
      file_name: mp.filename,
    });
    return reply.code(415).send({
      error: 'Unsupported Media Type',
      message: 'Only PDF, JPEG, and PNG files are allowed'
    });
  }

  // Read at least the first 4100 bytes to detect the file type via magic bytes.
  // Pull chunks from the async iterator directly: breaking out of a
  // for await...of loop would destroy the stream, and the remainder could
  // no longer be piped to storage below.
  const chunks: Buffer[] = [];
  let totalBytes = 0;
  const targetBytes = 4100;

  const iterator = mp.file[Symbol.asyncIterator]();
  while (totalBytes < targetBytes) {
    const { value: chunk, done } = await iterator.next();
    if (done) break;
    chunks.push(chunk);
    totalBytes += chunk.length;
  }

  const headerBuffer = Buffer.concat(chunks);

  // Validate actual file content using magic bytes
  const detectedType = await fileTypeFromBuffer(headerBuffer);

  if (!detectedType) {
    logger.warn('Unable to detect file type from content', {
      operation: 'documents.upload.type_detection_failed',
      user_id: userId,
      document_id: documentId,
      content_type: contentType,
      file_name: mp.filename,
    });
    return reply.code(415).send({
      error: 'Unsupported Media Type',
      message: 'Unable to verify file type from content'
    });
  }

  // Verify detected type matches claimed Content-Type
  const allowedDetectedTypes = allowedTypes.get(contentType);
  if (!allowedDetectedTypes || !allowedDetectedTypes.has(detectedType.mime)) {
    logger.warn('File content does not match Content-Type header', {
      operation: 'documents.upload.type_mismatch',
      user_id: userId,
      document_id: documentId,
      claimed_type: contentType,
      detected_type: detectedType.mime,
      file_name: mp.filename,
    });
    return reply.code(415).send({
      error: 'Unsupported Media Type',
      message: `File content (${detectedType.mime}) does not match claimed type (${contentType})`
    });
  }

  const originalName: string = mp.filename || 'upload';
  const ext = (() => {
    const e = path.extname(originalName).replace(/^\./, '').toLowerCase();
    if (e) return e;
    if (contentType === 'application/pdf') return 'pdf';
    if (contentType === 'image/jpeg') return 'jpg';
    if (contentType === 'image/png') return 'png';
    return 'bin';
  })();

  class CountingStream extends Transform {
    public bytes = 0;
    override _transform(chunk: any, _enc: BufferEncoding, cb: TransformCallback) {
      this.bytes += chunk.length || 0;
      cb(null, chunk);
    }
  }

  const counter = new CountingStream();

  // Create a new readable stream from the header buffer + remaining file chunks
  const headerStream = Readable.from([headerBuffer]);
  const remainingStream = mp.file;

  // Pipe header first, then remaining content through counter
  headerStream.pipe(counter, { end: false });
  headerStream.on('end', () => {
    remainingStream.pipe(counter);
  });

  const storage = getStorageService();
  const bucket = 'documents';
  const version = 'v1';
  const unique = cryptoRandom();
  const key = `documents/${userId}/${doc.vehicle_id}/${doc.id}/${version}/${unique}.${ext}`;

  await storage.putObject(bucket, key, counter, contentType, { 'x-original-filename': originalName });

  const updated = await this.service['repo'].updateStorageMeta(doc.id, userId, {
    storage_bucket: bucket,
    storage_key: key,
    file_name: originalName,
    content_type: contentType,
    file_size: counter.bytes,
    file_hash: null,
  });

  logger.info('Document upload completed', {
    operation: 'documents.upload.success',
    user_id: userId,
    document_id: documentId,
    vehicle_id: doc.vehicle_id,
    file_name: originalName,
    content_type: contentType,
    detected_type: detectedType.mime,
    file_size: counter.bytes,
    storage_key: key,
  });

  return reply.code(200).send(updated);
}
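
The header-then-remainder piping above is the subtle part of this method: the sniffed bytes must be replayed before the rest of the stream, and the counter must not end when the header stream does. A self-contained sketch of the same pattern (demo data only, not repo code):

```typescript
import { Readable, Transform, TransformCallback } from 'stream';

// Counts bytes as they pass through, without modifying them.
class CountingStream extends Transform {
  public bytes = 0;
  _transform(chunk: any, _enc: BufferEncoding, cb: TransformCallback) {
    this.bytes += chunk.length || 0;
    cb(null, chunk);
  }
}

async function demo(): Promise<{ body: string; bytes: number }> {
  const headerBuffer = Buffer.from('abcd');   // stands in for the sniffed header
  const remaining = Readable.from([Buffer.from('efgh'), Buffer.from('ij')]);
  const counter = new CountingStream();

  // Replay the header first; { end: false } keeps the counter open so the
  // remainder of the stream can follow once the header is drained.
  const headerStream = Readable.from([headerBuffer]);
  headerStream.pipe(counter, { end: false });
  headerStream.on('end', () => remaining.pipe(counter));

  const parts: Buffer[] = [];
  for await (const part of counter) parts.push(part);
  return { body: Buffer.concat(parts).toString(), bytes: counter.bytes };
}

demo().then(r => console.log(r)); // { body: 'abcdefghij', bytes: 10 }
```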

Testing:

# Install new dependency
cd backend && npm install

# Rebuild containers
make rebuild

# Test legitimate files
# - Upload PDF, JPEG, PNG - should succeed

# Test attack scenarios
# - Rename .exe to .pdf - should be REJECTED
# - Spoof Content-Type header - should be REJECTED
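
The attack scenarios above work because the Content-Type header is attacker-controlled while the leading bytes of the file are not. A hand-rolled sketch of the idea (the actual fix uses `fileTypeFromBuffer`, which recognizes far more formats than this):

```typescript
// Minimal illustration of magic-byte detection for the three allowed types
// (illustrative helper only, not the file-type library or repo code).
const SIGNATURES: Array<{ mime: string; bytes: number[] }> = [
  { mime: 'application/pdf', bytes: [0x25, 0x50, 0x44, 0x46] }, // "%PDF"
  { mime: 'image/png', bytes: [0x89, 0x50, 0x4e, 0x47] },
  { mime: 'image/jpeg', bytes: [0xff, 0xd8, 0xff] },
];

function sniffMime(header: Buffer): string | undefined {
  return SIGNATURES.find(s => s.bytes.every((b, i) => header[i] === b))?.mime;
}

// A Windows executable renamed to .pdf starts with "MZ", not "%PDF":
console.log(sniffMime(Buffer.from('MZ\x90\x00', 'latin1'))); // undefined
console.log(sniffMime(Buffer.from('%PDF-1.7')));             // application/pdf
```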

Fix 3: Google Maps API Key Exposure (HIGH)

CVSS Score: 6.5 (Medium-High)
Location: Multiple files in backend/src/features/stations/

Required Changes:

Backend Changes

Step 1: Update Google Maps Client

File: backend/src/features/stations/external/google-maps/google-maps.client.ts

Change 1: Update transformPlaceToStation method (lines 76-95)

private transformPlaceToStation(place: GooglePlace, searchLat: number, searchLng: number): Station {
  const distance = this.calculateDistance(
    searchLat,
    searchLng,
    place.geometry.location.lat,
    place.geometry.location.lng
  );

  // Store photo reference instead of full URL to avoid exposing API key
  let photoReference: string | undefined;
  if (place.photos && place.photos.length > 0 && place.photos[0]) {
    photoReference = place.photos[0].photo_reference;
  }

  const station: Station = {
    id: place.place_id,
    placeId: place.place_id,
    name: place.name,
    address: place.vicinity,
    latitude: place.geometry.location.lat,
    longitude: place.geometry.location.lng,
    distance
  };

  if (photoReference !== undefined) {
    station.photoReference = photoReference;
  }

  if (place.opening_hours?.open_now !== undefined) {
    station.isOpen = place.opening_hours.open_now;
  }

  if (place.rating !== undefined) {
    station.rating = place.rating;
  }

  return station;
}
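
`calculateDistance` is referenced above but not shown in this guide; it is presumably the usual haversine helper. A hypothetical sketch for orientation (name, signature, and km units are assumptions, not repo code):

```typescript
// Hypothetical sketch of the calculateDistance helper referenced above:
// great-circle (haversine) distance in kilometers between two coordinates.
function calculateDistance(lat1: number, lng1: number, lat2: number, lng2: number): number {
  const R = 6371; // mean Earth radius in km
  const toRad = (d: number) => (d * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLng = toRad(lng2 - lng1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLng / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

console.log(calculateDistance(0, 0, 0, 1)); // ~111 km per degree at the equator
```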

Change 2: Add new method after transformPlaceToStation

/**
 * Fetch photo from Google Maps API using photo reference
 * Used by photo proxy endpoint to serve photos without exposing API key
 */
async fetchPhoto(photoReference: string, maxWidth: number = 400): Promise<Buffer> {
  try {
    const response = await axios.get(
      `${this.baseURL}/photo`,
      {
        params: {
          photo_reference: photoReference,
          maxwidth: maxWidth,
          key: this.apiKey
        },
        responseType: 'arraybuffer'
      }
    );

    return Buffer.from(response.data);
  } catch (error) {
    logger.error('Failed to fetch photo from Google Maps', { error, photoReference });
    throw error;
  }
}

Step 2: Update Station Type (Backend)

File: backend/src/features/stations/domain/stations.types.ts
Line: 20

// OLD:
photoUrl?: string;

// NEW:
photoReference?: string;

Step 3: Update Stations Controller

File: backend/src/features/stations/api/stations.controller.ts

Add import at the top:

import { googleMapsClient } from '../external/google-maps/google-maps.client';

Add method after removeSavedStation:

async getStationPhoto(request: FastifyRequest<{ Params: { reference: string } }>, reply: FastifyReply) {
  try {
    const { reference } = request.params;

    if (!reference) {
      return reply.code(400).send({
        error: 'Bad Request',
        message: 'Photo reference is required'
      });
    }

    const photoBuffer = await googleMapsClient.fetchPhoto(reference);

    return reply
      .code(200)
      .header('Content-Type', 'image/jpeg')
      .header('Cache-Control', 'public, max-age=86400')
      .send(photoBuffer);
  } catch (error: any) {
    logger.error('Error fetching station photo', {
      error,
      reference: request.params.reference
    });
    return reply.code(500).send({
      error: 'Internal server error',
      message: 'Failed to fetch station photo'
    });
  }
}

Step 4: Update Stations Routes

File: backend/src/features/stations/api/stations.routes.ts

Add route before the closing };:

// GET /api/stations/photo/:reference - Proxy for Google Maps photos
fastify.get<{ Params: { reference: string } }>('/stations/photo/:reference', {
  preHandler: [fastify.authenticate],
  handler: stationsController.getStationPhoto.bind(stationsController)
});

Step 5: Update Stations Service

File: backend/src/features/stations/domain/stations.service.ts
Line: 144

// OLD:
photoUrl: station?.photoUrl,

// NEW:
photoReference: station?.photoReference,

Frontend Changes

Step 6: Update Station Type (Frontend)

File: frontend/src/features/stations/types/stations.types.ts
Line: 60

// OLD:
photoUrl?: string;

// NEW:
photoReference?: string;

Step 7: Create Photo Utils Helper

File: frontend/src/features/stations/utils/photo-utils.ts (NEW FILE)

/**
 * @ai-summary Helper utilities for station photos
 */

/**
 * Generate secure photo URL using backend proxy endpoint
 * This prevents exposing the Google Maps API key to the frontend
 */
export function getStationPhotoUrl(photoReference: string | undefined): string | undefined {
  if (!photoReference) {
    return undefined;
  }

  return `/api/stations/photo/${encodeURIComponent(photoReference)}`;
}

Step 8: Update StationCard Component

File: frontend/src/features/stations/components/StationCard.tsx

Add import at the top:

import { getStationPhotoUrl } from '../utils/photo-utils';

Update photo rendering (lines 86-93):

// OLD:
{station.photoUrl && (
  <CardMedia
    component="img"
    height="200"
    image={station.photoUrl}
    alt={station.name}
    sx={{ objectFit: 'cover' }}
  />
)}

// NEW:
{station.photoReference && (
  <CardMedia
    component="img"
    height="200"
    image={getStationPhotoUrl(station.photoReference)}
    alt={station.name}
    sx={{ objectFit: 'cover' }}
  />
)}

Testing:

# Rebuild containers
make rebuild

# Test station search with photos
# Verify API key is NOT in network requests
# Verify photos still load correctly
# Test on mobile and desktop
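
For context on what the proxy changes from the browser's perspective, a before/after sketch (the key and photo reference are placeholder values, not real credentials):

```typescript
// Illustration of the leak the proxy closes.
const apiKey = 'EXAMPLE_KEY';
const ref = 'AciIO2_example_reference';

// Before: the photo URL was built with the key and shipped to the browser,
// so the key appeared in every image request any client could inspect.
const leakyUrl =
  `https://maps.googleapis.com/maps/api/place/photo?maxwidth=400` +
  `&photo_reference=${ref}&key=${apiKey}`;

// After: the browser only ever sees the backend proxy path; the key stays
// server-side inside fetchPhoto().
const proxiedUrl = `/api/stations/photo/${encodeURIComponent(ref)}`;

console.log(leakyUrl.includes(apiKey));   // true — key visible to any client
console.log(proxiedUrl.includes(apiKey)); // false
```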

Implementation Checklist

  • Apply Fix 1: Insecure Random Generation

    • Add crypto import
    • Replace cryptoRandom function
    • Test document uploads
  • Apply Fix 2: File Upload Validation

    • Add file-type to package.json
    • Run npm install
    • Add imports
    • Replace upload method
    • Test with legitimate files
    • Test with malicious files
  • Apply Fix 3: API Key Exposure

    • Update Google Maps client
    • Update backend Station type
    • Add controller method
    • Add route
    • Update service
    • Update frontend Station type
    • Create photo-utils.ts
    • Update StationCard component
    • Test photos load correctly
    • Verify API key not in network
  • Run all tests

    • Backend tests: cd backend && npm test
    • Frontend tests: cd frontend && npm test
  • Rebuild and verify

    • make rebuild
    • Manual testing of all features
    • Security verification (no API keys exposed)
  • Create git commit with security fixes


Security Verification

After implementing all fixes, verify:

  1. Document Storage Keys: Now using crypto.randomBytes() - unpredictable
  2. File Upload: Magic byte validation rejects spoofed files
  3. Google Maps API: Key never sent to frontend, all requests proxied

Run security scan:

# Check for hardcoded secrets
grep -r "AIza" backend/ frontend/
# Should return no results

# Verify crypto usage
grep -r "Math.random()" backend/
# Should return no results in production code

Post-Implementation

After all fixes are applied and tested:

  1. Update AUDIT.md to reflect fixed status
  2. Commit changes with security fix message
  3. Run full test suite
  4. Deploy to staging for verification
  5. Schedule production deployment