Vehicle ETL Process fixed. Admin settings fixed.
@@ -95,7 +95,8 @@
       "Bash(tail:*)",
       "mcp__playwright__browser_close",
       "Bash(wc:*)",
-      "mcp__brave-search__brave_web_search"
+      "mcp__brave-search__brave_web_search",
+      "mcp__firecrawl__firecrawl_search"
     ],
     "deny": []
 }
@@ -1,180 +0,0 @@

# Stations (Gas/Fuel) Feature — Dispatchable Change Plan

This document is written as an execution plan that can be handed to multiple AI agents to implement in parallel.

## Repo Constraints (Must Follow)

- Docker-first workflow (production builds): validate changes via `make rebuild` and container logs.
- Mobile + Desktop requirement: every UI change must be validated on both `frontend/src/features/stations/pages/StationsPage.tsx` (desktop) and `frontend/src/features/stations/mobile/StationsMobileScreen.tsx` (mobile).
- Never expose the Google Maps API key to the browser or logs.

## Scope

1. Fix broken station photo rendering on the stations UI after the “hide Google API key” change.
2. Add navigation links for saved/favorite stations:
   - “Navigate in Google” (Google Maps)
   - “Navigate in Apple Maps” (Apple Maps)
   - “Navigate in Waze” (Waze)

## Bug: Station Photos Not Displaying

### Current Implementation (What Exists Today)

- Frontend cards render an `<img>` via MUI `CardMedia` when `station.photoReference` is present:
  - `frontend/src/features/stations/components/StationCard.tsx`
  - URL generation: `frontend/src/features/stations/utils/photo-utils.ts` → `/api/stations/photo/:reference`
- Backend exposes a proxy endpoint that fetches the Google Places photo (server-side, using the secret key):
  - Route: `GET /api/stations/photo/:reference`
  - `backend/src/features/stations/api/stations.routes.ts`
  - `backend/src/features/stations/api/stations.controller.ts`
  - Google client: `backend/src/features/stations/external/google-maps/google-maps.client.ts` (`fetchPhoto`)

### Likely Root Cause (Agents Must Confirm)

The photo endpoint is protected by `fastify.authenticate`, but `<img src="...">` requests do not include the Authorization header. This results in `401 Unauthorized` responses and broken images.

A second thing to confirm while debugging:

- Verify what `Station.photoReference` contains at runtime:
  - expected: a Google `photo_reference` token
  - risk: a code/docs mismatch where `photoReference` became a URL like `/api/stations/photo/{reference}`, causing double-encoding by `getStationPhotoUrl()`.

### Repro Checklist (Fast Confirmation)

- Open the stations page and observe broken images in the browser devtools Network tab:
  - `GET /api/stations/photo/...` should show `401` if the auth-header issue is the cause.
- Confirm backend logs show a JWT auth failure for photo requests.

## Decision: Image Strategy (Selected)

Selected: **Option A1** (keep images; authenticated blob fetch in the frontend; the photo endpoint remains JWT-protected).

### Option A (Keep Images): Fix Auth Mismatch Without Exposing API Key

#### Option A1 (Recommended): Fetch Photo as Blob via Authenticated XHR

Why: keeps `/api/stations/photo/:reference` protected (prevents public key abuse), avoids putting the JWT in query params, and avoids exposing the Google API key.

Implementation outline:

- Frontend: replace direct `<img src="/api/stations/photo/...">` usage with an authenticated fetch that includes the JWT (via the existing Axios `apiClient` interceptors), then render via a `blob:` object URL.
- Add a small component like `StationPhoto` used by `StationCard`:
  - `apiClient.get('/stations/photo/:reference', { responseType: 'blob' })`
  - `URL.createObjectURL(blob)` for display
  - `URL.revokeObjectURL` cleanup on unmount / reference change
  - graceful fallback (hide the image) on 401/500
- Backend: no route auth changes required.
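The fetch-and-fallback part of this outline can be sketched as a framework-agnostic helper. Here `fetchBlob` and `createUrl` are hypothetical injected stand-ins (not the repo's actual API) for `apiClient.get(..., { responseType: 'blob' })` and `URL.createObjectURL`, which keeps the logic testable outside a browser:

```typescript
// Sketch of the authenticated photo load behind a StationPhoto-like component.
// fetchBlob/createUrl are injected stand-ins (assumptions, not the repo's API)
// for apiClient.get(..., { responseType: 'blob' }) and URL.createObjectURL.
type FetchBlob = (path: string) => Promise<{ status: number; data: unknown }>;

export async function loadStationPhotoUrl(
  reference: string,
  fetchBlob: FetchBlob,
  createUrl: (blob: unknown) => string
): Promise<string | null> {
  try {
    const res = await fetchBlob(`/stations/photo/${encodeURIComponent(reference)}`);
    if (res.status !== 200) return null; // graceful fallback: render no image
    return createUrl(res.data); // caller must URL.revokeObjectURL on cleanup
  } catch {
    return null; // 401/500 -> hide the image instead of showing a broken <img>
  }
}
```

A `StationPhoto` component would call this in an effect, keep the returned URL in state, and revoke it on unmount or when `reference` changes.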
Tradeoffs:

- Slightly more frontend code, but minimal security risk.
- Must ensure caching behavior is acceptable (the browser won’t cache `blob:` URLs; rely on backend caching headers + client-side memoization).

### Option B (Remove Images): Simplify Cards

Why: if image delivery adds too much complexity or risk, remove images from station cards.

Implementation outline:

- Frontend: remove the `CardMedia` photo block from `StationCard` and any other station photo rendering.
- Leave `photoReference` in the API/types untouched for now (or remove it later as a cleanup task, in a separate PR).
- Update any tests that assert on image presence.

Tradeoffs:

- Reduced UX polish, but simplest and most robust.

## Feature: Navigation Links on Saved/Favorite Stations

### UX Requirements

- On the saved-station UI (desktop + mobile), provide 3 explicit navigation options:
  - Google Maps
  - Apple Maps
  - Waze
- “Saved/favorite” is interpreted as “stations in the Saved list”; favorites are a subset.

### URL Construction (Preferred)

Use coordinates if available; fall back to an address query if not.

- Google Maps:
  - Preferred: `https://www.google.com/maps/dir/?api=1&destination=LAT,LNG&destination_place_id=PLACE_ID`
  - Fallback: `https://www.google.com/maps/search/?api=1&query=ENCODED_QUERY`
- Apple Maps:
  - Preferred: `https://maps.apple.com/?daddr=LAT,LNG`
  - Fallback: `https://maps.apple.com/?q=ENCODED_QUERY`
- Waze:
  - Preferred: `https://waze.com/ul?ll=LAT,LNG&navigate=yes`
  - Fallback: `https://waze.com/ul?q=ENCODED_QUERY&navigate=yes`

Important: some saved stations may have `latitude/longitude = 0` on a cache miss; treat `(0,0)` as “no coordinates”.
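The URL rules above can be sketched as the shared builder utility (the `navigation-links.ts` export names and the `StationLocation` shape are assumptions for illustration; it treats `(0,0)` as "no coordinates" per the cache-miss caveat):

```typescript
// Hypothetical navigation-links.ts sketch following the URL rules in this plan.
export interface StationLocation {
  latitude: number;
  longitude: number;
  name?: string;
  address?: string;
}

// (0,0) means "no coordinates" (cache-miss placeholder), so fall back to a query.
function hasCoords(s: StationLocation): boolean {
  return !(s.latitude === 0 && s.longitude === 0);
}

function encodedQuery(s: StationLocation): string {
  return encodeURIComponent([s.name, s.address].filter(Boolean).join(' '));
}

export function googleMapsUrl(s: StationLocation, placeId?: string): string {
  if (hasCoords(s)) {
    const base = `https://www.google.com/maps/dir/?api=1&destination=${s.latitude},${s.longitude}`;
    return placeId ? `${base}&destination_place_id=${placeId}` : base;
  }
  return `https://www.google.com/maps/search/?api=1&query=${encodedQuery(s)}`;
}

export function appleMapsUrl(s: StationLocation): string {
  return hasCoords(s)
    ? `https://maps.apple.com/?daddr=${s.latitude},${s.longitude}`
    : `https://maps.apple.com/?q=${encodedQuery(s)}`;
}

export function wazeUrl(s: StationLocation): string {
  return hasCoords(s)
    ? `https://waze.com/ul?ll=${s.latitude},${s.longitude}&navigate=yes`
    : `https://waze.com/ul?q=${encodedQuery(s)}&navigate=yes`;
}
```

Keeping all three builders in one module gives the desktop menu and the mobile bottom sheet a single tested source of truth.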
### UI Placement Recommendation

- Desktop saved list: add a “Navigate” icon button that opens a small menu with the 3 links (cleaner than inline links inside `ListItemText`).
  - File: `frontend/src/features/stations/components/SavedStationsList.tsx`
- Mobile bottom sheet (station details): add a “Navigate” section with the same 3 links as buttons.
  - File: `frontend/src/features/stations/mobile/StationsMobileScreen.tsx`

## Work Breakdown for Multiple Agents

### Agent 1 — Confirm Root Cause + Backend Adjustments (If Needed)

Deliverables:

- Confirm whether photo requests return `401` due to a missing Authorization header.
- Confirm whether `photoReference` is a raw reference token vs a URL string.
- Implement backend changes only if Option A2 is chosen.

Files likely touched (Option A2 only):

- `backend/src/features/stations/api/stations.routes.ts` (remove the auth preHandler on the photo route)
- `backend/src/features/stations/api/stations.controller.ts` (add stricter validation; keep cache headers)
- `backend/src/features/stations/docs/API.md` (update auth expectations for the photo endpoint)

### Agent 2 — Frontend Photo Fix (Option A1) OR Photo Removal (Option B)

Deliverables:

- Option A1: implement authenticated blob photo loading for station cards.
- Option B: remove station photos from cards cleanly (no layout regressions).

Files likely touched:

- `frontend/src/features/stations/components/StationCard.tsx`
- Option A1:
  - Add `frontend/src/features/stations/components/StationPhoto.tsx` (or similar)
  - Potentially update `frontend/src/features/stations/utils/photo-utils.ts`
- Add unit tests under `frontend/src/features/stations/__tests__/`

### Agent 3 — Navigation Links for Saved Stations (Desktop + Mobile)

Deliverables:

- Create a single URL-builder utility with tests.
- Add a “Navigate” menu/section in the saved stations UI (desktop + mobile).

Files likely touched:

- `frontend/src/features/stations/utils/` (new `navigation-links.ts`)
- `frontend/src/features/stations/components/SavedStationsList.tsx`
- `frontend/src/features/stations/mobile/StationsMobileScreen.tsx`
- Optional: reuse in `frontend/src/features/stations/components/StationCard.tsx` (only if product wants it outside Saved)

### Agent 4 — Tests + QA Pass (Update What Breaks)

Deliverables:

- Update/extend tests to cover:
  - navigation menu/links present for saved stations
  - photo rendering behavior per the chosen option
- Ensure both desktop and mobile flows still pass basic E2E checks.

Files likely touched:

- `frontend/cypress/e2e/stations.cy.ts`
- `frontend/src/features/stations/__tests__/components/StationCard.test.tsx`
- New tests for `navigation-links.ts`

## Acceptance Criteria

- Station photos render on station cards via Option A1 without exposing the Google API key (no `401` responses for photo requests in the Network tab).
- Saved stations show 3 navigation options (Google, Apple, Waze) on both desktop and mobile.
- No lint/test regressions; the container build succeeds.

## Validation (Container-First)

- Rebuild and watch logs: `make rebuild`, then `make logs`
- Optional focused logs: `make logs-frontend` and `make logs-backend`
- Run feature tests where available (prefer container exec):
  - Backend: `docker compose exec mvp-backend npm test -- features/stations`
  - Frontend: `docker compose exec mvp-frontend npm test -- stations`
  - E2E: `docker compose exec mvp-frontend npm run e2e`
@@ -20,6 +20,7 @@ import { StationOversightService } from '../domain/station-oversight.service';
 import { StationsController } from './stations.controller';
 import { CatalogController } from './catalog.controller';
 import { VehicleCatalogService } from '../domain/vehicle-catalog.service';
+import { CatalogImportService } from '../domain/catalog-import.service';
 import { PlatformCacheService } from '../../platform/domain/platform-cache.service';
 import { cacheService } from '../../../core/config/redis';
 import { pool } from '../../../core/config/database';
@@ -35,7 +36,9 @@ export const adminRoutes: FastifyPluginAsync = async (fastify) => {
   // Initialize catalog dependencies
   const platformCacheService = new PlatformCacheService(cacheService);
   const catalogService = new VehicleCatalogService(pool, platformCacheService);
+  const catalogImportService = new CatalogImportService(pool);
   const catalogController = new CatalogController(catalogService);
+  catalogController.setImportService(catalogImportService);
 
   // Admin access verification (used by frontend auth checks)
   fastify.get('/admin/verify', {
@@ -205,6 +208,49 @@ export const adminRoutes: FastifyPluginAsync = async (fastify) => {
     handler: catalogController.getChangeLogs.bind(catalogController)
   });
 
+  // Search endpoint - full-text search across vehicle_options
+  fastify.get('/admin/catalog/search', {
+    preHandler: [fastify.requireAdmin],
+    handler: catalogController.searchCatalog.bind(catalogController)
+  });
+
+  // Cascade delete endpoints - delete entity and all its children
+  fastify.delete('/admin/catalog/makes/:makeId/cascade', {
+    preHandler: [fastify.requireAdmin],
+    handler: catalogController.deleteMakeCascade.bind(catalogController)
+  });
+
+  fastify.delete('/admin/catalog/models/:modelId/cascade', {
+    preHandler: [fastify.requireAdmin],
+    handler: catalogController.deleteModelCascade.bind(catalogController)
+  });
+
+  fastify.delete('/admin/catalog/years/:yearId/cascade', {
+    preHandler: [fastify.requireAdmin],
+    handler: catalogController.deleteYearCascade.bind(catalogController)
+  });
+
+  fastify.delete('/admin/catalog/trims/:trimId/cascade', {
+    preHandler: [fastify.requireAdmin],
+    handler: catalogController.deleteTrimCascade.bind(catalogController)
+  });
+
+  // Import/Export endpoints
+  fastify.post('/admin/catalog/import/preview', {
+    preHandler: [fastify.requireAdmin],
+    handler: catalogController.importPreview.bind(catalogController)
+  });
+
+  fastify.post('/admin/catalog/import/apply', {
+    preHandler: [fastify.requireAdmin],
+    handler: catalogController.importApply.bind(catalogController)
+  });
+
+  fastify.get('/admin/catalog/export', {
+    preHandler: [fastify.requireAdmin],
+    handler: catalogController.exportCatalog.bind(catalogController)
+  });
+
   // Bulk delete endpoint
   fastify.delete<{ Params: { entity: CatalogEntity }; Body: BulkDeleteCatalogInput }>('/admin/catalog/:entity/bulk-delete', {
     preHandler: [fastify.requireAdmin],
@@ -5,11 +5,18 @@
 
 import { FastifyRequest, FastifyReply } from 'fastify';
 import { VehicleCatalogService } from '../domain/vehicle-catalog.service';
+import { CatalogImportService } from '../domain/catalog-import.service';
 import { logger } from '../../../core/logging/logger';
 
 export class CatalogController {
+  private importService: CatalogImportService | null = null;
+
   constructor(private catalogService: VehicleCatalogService) {}
 
+  setImportService(importService: CatalogImportService): void {
+    this.importService = importService;
+  }
+
   // MAKES ENDPOINTS
 
   async getMakes(_request: FastifyRequest, reply: FastifyReply): Promise<void> {
@@ -593,6 +600,221 @@ export class CatalogController {
     }
   }
 
+  // SEARCH ENDPOINT
+
+  async searchCatalog(
+    request: FastifyRequest<{ Querystring: { q?: string; page?: string; pageSize?: string } }>,
+    reply: FastifyReply
+  ): Promise<void> {
+    try {
+      const query = request.query.q || '';
+      const page = parseInt(request.query.page || '1');
+      const pageSize = Math.min(parseInt(request.query.pageSize || '50'), 100); // Max 100 per page
+
+      if (isNaN(page) || page < 1) {
+        reply.code(400).send({ error: 'Invalid page number' });
+        return;
+      }
+
+      if (isNaN(pageSize) || pageSize < 1) {
+        reply.code(400).send({ error: 'Invalid page size' });
+        return;
+      }
+
+      const result = await this.catalogService.searchCatalog(query, page, pageSize);
+      reply.code(200).send(result);
+    } catch (error) {
+      logger.error('Error searching catalog', { error });
+      reply.code(500).send({ error: 'Failed to search catalog' });
+    }
+  }
+
+  // CASCADE DELETE ENDPOINTS
+
+  async deleteMakeCascade(
+    request: FastifyRequest<{ Params: { makeId: string } }>,
+    reply: FastifyReply
+  ): Promise<void> {
+    try {
+      const makeId = parseInt(request.params.makeId);
+      const actorId = request.userContext?.userId || 'unknown';
+
+      if (isNaN(makeId)) {
+        reply.code(400).send({ error: 'Invalid make ID' });
+        return;
+      }
+
+      const result = await this.catalogService.deleteMakeCascade(makeId, actorId);
+      reply.code(200).send(result);
+    } catch (error: any) {
+      logger.error('Error cascade deleting make', { error });
+      if (error.message?.includes('not found')) {
+        reply.code(404).send({ error: error.message });
+      } else {
+        reply.code(500).send({ error: 'Failed to cascade delete make' });
+      }
+    }
+  }
+
+  async deleteModelCascade(
+    request: FastifyRequest<{ Params: { modelId: string } }>,
+    reply: FastifyReply
+  ): Promise<void> {
+    try {
+      const modelId = parseInt(request.params.modelId);
+      const actorId = request.userContext?.userId || 'unknown';
+
+      if (isNaN(modelId)) {
+        reply.code(400).send({ error: 'Invalid model ID' });
+        return;
+      }
+
+      const result = await this.catalogService.deleteModelCascade(modelId, actorId);
+      reply.code(200).send(result);
+    } catch (error: any) {
+      logger.error('Error cascade deleting model', { error });
+      if (error.message?.includes('not found')) {
+        reply.code(404).send({ error: error.message });
+      } else {
+        reply.code(500).send({ error: 'Failed to cascade delete model' });
+      }
+    }
+  }
+
+  async deleteYearCascade(
+    request: FastifyRequest<{ Params: { yearId: string } }>,
+    reply: FastifyReply
+  ): Promise<void> {
+    try {
+      const yearId = parseInt(request.params.yearId);
+      const actorId = request.userContext?.userId || 'unknown';
+
+      if (isNaN(yearId)) {
+        reply.code(400).send({ error: 'Invalid year ID' });
+        return;
+      }
+
+      const result = await this.catalogService.deleteYearCascade(yearId, actorId);
+      reply.code(200).send(result);
+    } catch (error: any) {
+      logger.error('Error cascade deleting year', { error });
+      if (error.message?.includes('not found')) {
+        reply.code(404).send({ error: error.message });
+      } else {
+        reply.code(500).send({ error: 'Failed to cascade delete year' });
+      }
+    }
+  }
+
+  async deleteTrimCascade(
+    request: FastifyRequest<{ Params: { trimId: string } }>,
+    reply: FastifyReply
+  ): Promise<void> {
+    try {
+      const trimId = parseInt(request.params.trimId);
+      const actorId = request.userContext?.userId || 'unknown';
+
+      if (isNaN(trimId)) {
+        reply.code(400).send({ error: 'Invalid trim ID' });
+        return;
+      }
+
+      const result = await this.catalogService.deleteTrimCascade(trimId, actorId);
+      reply.code(200).send(result);
+    } catch (error: any) {
+      logger.error('Error cascade deleting trim', { error });
+      if (error.message?.includes('not found')) {
+        reply.code(404).send({ error: error.message });
+      } else {
+        reply.code(500).send({ error: 'Failed to cascade delete trim' });
+      }
+    }
+  }
+
+  // IMPORT/EXPORT ENDPOINTS
+
+  async importPreview(
+    request: FastifyRequest,
+    reply: FastifyReply
+  ): Promise<void> {
+    try {
+      if (!this.importService) {
+        reply.code(500).send({ error: 'Import service not configured' });
+        return;
+      }
+
+      const data = await request.file();
+      if (!data) {
+        reply.code(400).send({ error: 'No file uploaded' });
+        return;
+      }
+
+      const buffer = await data.toBuffer();
+      const csvContent = buffer.toString('utf-8');
+
+      const result = await this.importService.previewImport(csvContent);
+      reply.code(200).send(result);
+    } catch (error: any) {
+      logger.error('Error previewing import', { error });
+      reply.code(500).send({ error: error.message || 'Failed to preview import' });
+    }
+  }
+
+  async importApply(
+    request: FastifyRequest<{ Body: { previewId: string } }>,
+    reply: FastifyReply
+  ): Promise<void> {
+    try {
+      if (!this.importService) {
+        reply.code(500).send({ error: 'Import service not configured' });
+        return;
+      }
+
+      const { previewId } = request.body;
+      const actorId = request.userContext?.userId || 'unknown';
+
+      if (!previewId) {
+        reply.code(400).send({ error: 'Preview ID is required' });
+        return;
+      }
+
+      const result = await this.importService.applyImport(previewId, actorId);
+      reply.code(200).send(result);
+    } catch (error: any) {
+      logger.error('Error applying import', { error });
+      if (error.message?.includes('expired') || error.message?.includes('not found')) {
+        reply.code(404).send({ error: error.message });
+      } else if (error.message?.includes('validation errors')) {
+        reply.code(400).send({ error: error.message });
+      } else {
+        reply.code(500).send({ error: error.message || 'Failed to apply import' });
+      }
+    }
+  }
+
+  async exportCatalog(
+    _request: FastifyRequest,
+    reply: FastifyReply
+  ): Promise<void> {
+    try {
+      if (!this.importService) {
+        reply.code(500).send({ error: 'Import service not configured' });
+        return;
+      }
+
+      const csvContent = await this.importService.exportCatalog();
+
+      reply
+        .header('Content-Type', 'text/csv')
+        .header('Content-Disposition', 'attachment; filename="vehicle-catalog.csv"')
+        .code(200)
+        .send(csvContent);
+    } catch (error: any) {
+      logger.error('Error exporting catalog', { error });
+      reply.code(500).send({ error: 'Failed to export catalog' });
+    }
+  }
+
   // BULK DELETE ENDPOINT
 
   async bulkDeleteCatalogEntity(
476 backend/src/features/admin/domain/catalog-import.service.ts Normal file
@@ -0,0 +1,476 @@
+/**
+ * @ai-summary Catalog CSV import/export service
+ * @ai-context Handles bulk import with preview and export of vehicle catalog data
+ */
+
+import { Pool } from 'pg';
+import { v4 as uuidv4 } from 'uuid';
+import { logger } from '../../../core/logging/logger';
+
+export interface ImportRow {
+  action: 'add' | 'update' | 'delete';
+  year: number;
+  make: string;
+  model: string;
+  trim: string;
+  engineName: string | null;
+  transmissionType: string | null;
+}
+
+export interface ImportError {
+  row: number;
+  error: string;
+}
+
+export interface ImportPreviewResult {
+  previewId: string;
+  toCreate: ImportRow[];
+  toUpdate: ImportRow[];
+  toDelete: ImportRow[];
+  errors: ImportError[];
+  valid: boolean;
+}
+
+export interface ImportApplyResult {
+  created: number;
+  updated: number;
+  deleted: number;
+  errors: ImportError[];
+}
+
+export interface ExportRow {
+  year: number;
+  make: string;
+  model: string;
+  trim: string;
+  engineName: string | null;
+  transmissionType: string | null;
+}
+
+// In-memory preview cache (expires after 15 minutes)
+const previewCache = new Map<string, { data: ImportPreviewResult; expiresAt: number }>();
+
+export class CatalogImportService {
+  constructor(private pool: Pool) {}
+
+  /**
+   * Parse CSV content and validate without applying changes
+   */
+  async previewImport(csvContent: string): Promise<ImportPreviewResult> {
+    const previewId = uuidv4();
+    const toCreate: ImportRow[] = [];
+    const toUpdate: ImportRow[] = [];
+    const toDelete: ImportRow[] = [];
+    const errors: ImportError[] = [];
+
+    const lines = csvContent.trim().split('\n');
+    if (lines.length < 2) {
+      return {
+        previewId,
+        toCreate,
+        toUpdate,
+        toDelete,
+        errors: [{ row: 0, error: 'CSV must have a header row and at least one data row' }],
+        valid: false,
+      };
+    }
+
+    // Parse header row
+    const header = this.parseCSVLine(lines[0]);
+    const headerLower = header.map(h => h.toLowerCase().trim());
+
+    // Validate required headers
+    const requiredHeaders = ['action', 'year', 'make', 'model', 'trim'];
+    for (const required of requiredHeaders) {
+      if (!headerLower.includes(required)) {
+        return {
+          previewId,
+          toCreate,
+          toUpdate,
+          toDelete,
+          errors: [{ row: 1, error: `Missing required header: ${required}` }],
+          valid: false,
+        };
+      }
+    }
+
+    // Find column indices
+    const colIndices = {
+      action: headerLower.indexOf('action'),
+      year: headerLower.indexOf('year'),
+      make: headerLower.indexOf('make'),
+      model: headerLower.indexOf('model'),
+      trim: headerLower.indexOf('trim'),
+      engineName: headerLower.indexOf('engine_name'),
+      transmissionType: headerLower.indexOf('transmission_type'),
+    };
+
+    // Parse data rows
+    for (let i = 1; i < lines.length; i++) {
+      const line = lines[i].trim();
+      if (!line) continue;
+
+      const values = this.parseCSVLine(line);
+      const rowNum = i + 1;
+
+      try {
+        const action = values[colIndices.action]?.toLowerCase().trim();
+        const year = parseInt(values[colIndices.year], 10);
+        const make = values[colIndices.make]?.trim();
+        const model = values[colIndices.model]?.trim();
+        const trim = values[colIndices.trim]?.trim();
+        const engineName = colIndices.engineName >= 0 ? values[colIndices.engineName]?.trim() || null : null;
+        const transmissionType = colIndices.transmissionType >= 0 ? values[colIndices.transmissionType]?.trim() || null : null;
+
+        // Validate action
+        if (!['add', 'update', 'delete'].includes(action)) {
+          errors.push({ row: rowNum, error: `Invalid action: ${action}. Must be add, update, or delete` });
+          continue;
+        }
+
+        // Validate year
+        if (isNaN(year) || year < 1900 || year > 2100) {
+          errors.push({ row: rowNum, error: `Invalid year: ${values[colIndices.year]}` });
+          continue;
+        }
+
+        // Validate required fields
+        if (!make) {
+          errors.push({ row: rowNum, error: 'Make is required' });
+          continue;
+        }
+        if (!model) {
+          errors.push({ row: rowNum, error: 'Model is required' });
+          continue;
+        }
+        if (!trim) {
+          errors.push({ row: rowNum, error: 'Trim is required' });
+          continue;
+        }
+
+        const row: ImportRow = {
+          action: action as 'add' | 'update' | 'delete',
+          year,
+          make,
+          model,
+          trim,
+          engineName,
+          transmissionType,
+        };
+
+        // Check if record exists for validation
+        const existsResult = await this.pool.query(
+          `SELECT id FROM vehicle_options
+           WHERE year = $1 AND make = $2 AND model = $3 AND trim = $4
+           LIMIT 1`,
+          [year, make, model, trim]
+        );
+        const exists = (existsResult.rowCount || 0) > 0;
+
+        if (action === 'add' && exists) {
+          errors.push({ row: rowNum, error: `Record already exists: ${year} ${make} ${model} ${trim}` });
+          continue;
+        }
+        if ((action === 'update' || action === 'delete') && !exists) {
+          errors.push({ row: rowNum, error: `Record not found: ${year} ${make} ${model} ${trim}` });
+          continue;
+        }
+
+        // Sort into appropriate bucket
+        switch (action) {
+          case 'add':
+            toCreate.push(row);
+            break;
+          case 'update':
+            toUpdate.push(row);
+            break;
+          case 'delete':
+            toDelete.push(row);
+            break;
+        }
+      } catch (error: any) {
+        errors.push({ row: rowNum, error: error.message || 'Parse error' });
+      }
+    }
+
+    const result: ImportPreviewResult = {
+      previewId,
+      toCreate,
+      toUpdate,
+      toDelete,
+      errors,
+      valid: errors.length === 0,
+    };
+
+    // Cache preview for 15 minutes
+    previewCache.set(previewId, {
+      data: result,
+      expiresAt: Date.now() + 15 * 60 * 1000,
+    });
+
+    // Clean up expired previews
+    this.cleanupExpiredPreviews();
+
+    return result;
+  }
+
+  /**
+   * Apply a previously previewed import
+   */
+  async applyImport(previewId: string, changedBy: string): Promise<ImportApplyResult> {
+    const cached = previewCache.get(previewId);
+    if (!cached || cached.expiresAt < Date.now()) {
+      throw new Error('Preview expired or not found. Please upload the file again.');
+    }
+
+    const preview = cached.data;
+    if (!preview.valid) {
+      throw new Error('Cannot apply import with validation errors');
+    }
+
+    const result: ImportApplyResult = {
+      created: 0,
+      updated: 0,
+      deleted: 0,
+      errors: [],
+    };
+
+    const client = await this.pool.connect();
+    try {
+      await client.query('BEGIN');
+
+      // Process creates
+      for (const row of preview.toCreate) {
+        try {
+          // Get or create engine
+          let engineId: number | null = null;
+          if (row.engineName) {
+            const engineResult = await client.query(
+              `INSERT INTO engines (name, fuel_type)
+               VALUES ($1, 'Gas')
+               ON CONFLICT (LOWER(name)) DO UPDATE SET name = EXCLUDED.name
+               RETURNING id`,
+              [row.engineName]
+            );
+            engineId = engineResult.rows[0].id;
+          }
+
+          // Get or create transmission
+          let transmissionId: number | null = null;
|
if (row.transmissionType) {
|
||||||
|
const transResult = await client.query(
|
||||||
|
`INSERT INTO transmissions (type)
|
||||||
|
VALUES ($1)
|
||||||
|
ON CONFLICT (LOWER(type)) DO UPDATE SET type = EXCLUDED.type
|
||||||
|
RETURNING id`,
|
||||||
|
[row.transmissionType]
|
||||||
|
);
|
||||||
|
transmissionId = transResult.rows[0].id;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Insert vehicle option
|
||||||
|
await client.query(
|
||||||
|
`INSERT INTO vehicle_options (year, make, model, trim, engine_id, transmission_id)
|
||||||
|
VALUES ($1, $2, $3, $4, $5, $6)`,
|
||||||
|
[row.year, row.make, row.model, row.trim, engineId, transmissionId]
|
||||||
|
);
|
||||||
|
|
||||||
|
result.created++;
|
||||||
|
} catch (error: any) {
|
||||||
|
result.errors.push({ row: 0, error: `Failed to create ${row.year} ${row.make} ${row.model} ${row.trim}: ${error.message}` });
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Process updates
|
||||||
|
for (const row of preview.toUpdate) {
|
||||||
|
try {
|
||||||
|
// Get or create engine
|
||||||
|
let engineId: number | null = null;
|
||||||
|
if (row.engineName) {
|
||||||
|
const engineResult = await client.query(
|
||||||
|
`INSERT INTO engines (name, fuel_type)
|
||||||
|
VALUES ($1, 'Gas')
|
||||||
|
ON CONFLICT (LOWER(name)) DO UPDATE SET name = EXCLUDED.name
|
||||||
|
RETURNING id`,
|
||||||
|
[row.engineName]
|
||||||
|
);
|
||||||
|
engineId = engineResult.rows[0].id;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Get or create transmission
|
||||||
|
let transmissionId: number | null = null;
|
||||||
|
if (row.transmissionType) {
|
||||||
|
const transResult = await client.query(
|
||||||
|
`INSERT INTO transmissions (type)
|
||||||
|
VALUES ($1)
|
||||||
|
ON CONFLICT (LOWER(type)) DO UPDATE SET type = EXCLUDED.type
|
||||||
|
RETURNING id`,
|
||||||
|
[row.transmissionType]
|
||||||
|
);
|
||||||
|
transmissionId = transResult.rows[0].id;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Update vehicle option
|
||||||
|
await client.query(
|
||||||
|
`UPDATE vehicle_options
|
||||||
|
SET engine_id = $5, transmission_id = $6, updated_at = NOW()
|
||||||
|
WHERE year = $1 AND make = $2 AND model = $3 AND trim = $4`,
|
||||||
|
[row.year, row.make, row.model, row.trim, engineId, transmissionId]
|
||||||
|
);
|
||||||
|
|
||||||
|
result.updated++;
|
||||||
|
} catch (error: any) {
|
||||||
|
result.errors.push({ row: 0, error: `Failed to update ${row.year} ${row.make} ${row.model} ${row.trim}: ${error.message}` });
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Process deletes
|
||||||
|
for (const row of preview.toDelete) {
|
||||||
|
try {
|
||||||
|
await client.query(
|
||||||
|
`DELETE FROM vehicle_options
|
||||||
|
WHERE year = $1 AND make = $2 AND model = $3 AND trim = $4`,
|
||||||
|
[row.year, row.make, row.model, row.trim]
|
||||||
|
);
|
||||||
|
result.deleted++;
|
||||||
|
} catch (error: any) {
|
||||||
|
result.errors.push({ row: 0, error: `Failed to delete ${row.year} ${row.make} ${row.model} ${row.trim}: ${error.message}` });
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
await client.query('COMMIT');
|
||||||
|
|
||||||
|
// Log the import action
|
||||||
|
await this.pool.query(
|
||||||
|
`INSERT INTO platform_change_log (change_type, resource_type, resource_id, old_value, new_value, changed_by)
|
||||||
|
VALUES ('CREATE', 'import', $1, NULL, $2, $3)`,
|
||||||
|
[
|
||||||
|
previewId,
|
||||||
|
JSON.stringify({ created: result.created, updated: result.updated, deleted: result.deleted }),
|
||||||
|
changedBy,
|
||||||
|
]
|
||||||
|
);
|
||||||
|
|
||||||
|
// Remove preview from cache
|
||||||
|
previewCache.delete(previewId);
|
||||||
|
|
||||||
|
logger.info('Catalog import completed', {
|
||||||
|
previewId,
|
||||||
|
created: result.created,
|
||||||
|
updated: result.updated,
|
||||||
|
deleted: result.deleted,
|
||||||
|
errors: result.errors.length,
|
||||||
|
changedBy,
|
||||||
|
});
|
||||||
|
|
||||||
|
return result;
|
||||||
|
} catch (error) {
|
||||||
|
await client.query('ROLLBACK');
|
||||||
|
throw error;
|
||||||
|
} finally {
|
||||||
|
client.release();
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Export all vehicle options as CSV
|
||||||
|
*/
|
||||||
|
async exportCatalog(): Promise<string> {
|
||||||
|
const result = await this.pool.query(`
|
||||||
|
SELECT
|
||||||
|
vo.year,
|
||||||
|
vo.make,
|
||||||
|
vo.model,
|
||||||
|
vo.trim,
|
||||||
|
e.name AS engine_name,
|
||||||
|
t.type AS transmission_type
|
||||||
|
FROM vehicle_options vo
|
||||||
|
LEFT JOIN engines e ON vo.engine_id = e.id
|
||||||
|
LEFT JOIN transmissions t ON vo.transmission_id = t.id
|
||||||
|
ORDER BY vo.year DESC, vo.make ASC, vo.model ASC, vo.trim ASC
|
||||||
|
`);
|
||||||
|
|
||||||
|
// Build CSV
|
||||||
|
const header = 'year,make,model,trim,engine_name,transmission_type';
|
||||||
|
const rows = result.rows.map(row => {
|
||||||
|
return [
|
||||||
|
row.year,
|
||||||
|
this.escapeCSVField(row.make),
|
||||||
|
this.escapeCSVField(row.model),
|
||||||
|
this.escapeCSVField(row.trim),
|
||||||
|
this.escapeCSVField(row.engine_name || ''),
|
||||||
|
this.escapeCSVField(row.transmission_type || ''),
|
||||||
|
].join(',');
|
||||||
|
});
|
||||||
|
|
||||||
|
return [header, ...rows].join('\n');
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Parse a single CSV line, handling quoted fields
|
||||||
|
*/
|
||||||
|
private parseCSVLine(line: string): string[] {
|
||||||
|
const result: string[] = [];
|
||||||
|
let current = '';
|
||||||
|
let inQuotes = false;
|
||||||
|
|
||||||
|
for (let i = 0; i < line.length; i++) {
|
||||||
|
const char = line[i];
|
||||||
|
const nextChar = line[i + 1];
|
||||||
|
|
||||||
|
if (inQuotes) {
|
||||||
|
if (char === '"' && nextChar === '"') {
|
||||||
|
current += '"';
|
||||||
|
i++; // Skip next quote
|
||||||
|
} else if (char === '"') {
|
||||||
|
inQuotes = false;
|
||||||
|
} else {
|
||||||
|
current += char;
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
if (char === '"') {
|
||||||
|
inQuotes = true;
|
||||||
|
} else if (char === ',') {
|
||||||
|
result.push(current);
|
||||||
|
current = '';
|
||||||
|
} else {
|
||||||
|
current += char;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
result.push(current);
|
||||||
|
return result;
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Escape a field for CSV output
|
||||||
|
*/
|
||||||
|
private escapeCSVField(value: string): string {
|
||||||
|
if (!value) return '';
|
||||||
|
if (value.includes(',') || value.includes('"') || value.includes('\n')) {
|
||||||
|
return `"${value.replace(/"/g, '""')}"`;
|
||||||
|
}
|
||||||
|
return value;
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Clean up expired preview cache entries
|
||||||
|
*/
|
||||||
|
private cleanupExpiredPreviews(): void {
|
||||||
|
const now = Date.now();
|
||||||
|
for (const [id, entry] of previewCache.entries()) {
|
||||||
|
if (entry.expiresAt < now) {
|
||||||
|
previewCache.delete(id);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Simple UUID generation without external dependency
|
||||||
|
function uuidv4(): string {
|
||||||
|
return 'xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx'.replace(/[xy]/g, (c) => {
|
||||||
|
const r = Math.random() * 16 | 0;
|
||||||
|
const v = c === 'x' ? r : (r & 0x3 | 0x8);
|
||||||
|
return v.toString(16);
|
||||||
|
});
|
||||||
|
}
|
||||||
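The export and import paths above are only consistent if `escapeCSVField` and `parseCSVLine` round-trip each other. A minimal sketch (illustrative only, not part of the commit) with standalone copies of both helpers, assuming the same quoting rules as the service methods:

```typescript
// Standalone copies of the two CSV helpers, for a quick round-trip check.
function escapeCSVField(value: string): string {
  if (!value) return '';
  if (value.includes(',') || value.includes('"') || value.includes('\n')) {
    return `"${value.replace(/"/g, '""')}"`; // double embedded quotes, wrap the field
  }
  return value;
}

function parseCSVLine(line: string): string[] {
  const result: string[] = [];
  let current = '';
  let inQuotes = false;
  for (let i = 0; i < line.length; i++) {
    const char = line[i];
    const nextChar = line[i + 1];
    if (inQuotes) {
      if (char === '"' && nextChar === '"') { current += '"'; i++; } // escaped quote
      else if (char === '"') inQuotes = false;                        // closing quote
      else current += char;
    } else {
      if (char === '"') inQuotes = true;
      else if (char === ',') { result.push(current); current = ''; }  // field boundary
      else current += char;
    }
  }
  result.push(current);
  return result;
}

// Fields containing commas and quotes survive a full escape/parse cycle.
const fields = ['F-150', 'XLT, SuperCrew', 'says "hi"'];
const line = fields.map(escapeCSVField).join(',');
console.log(parseCSVLine(line)); // round-trips to the original fields
```

This is the same quoting convention RFC 4180 describes, which is why exported catalogs can be re-imported without a separate CSV library.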
@@ -60,6 +60,34 @@ export interface PlatformChangeLog {
  createdAt: Date;
}

export interface CatalogSearchResult {
  id: number;
  year: number;
  make: string;
  model: string;
  trim: string;
  engineId: number | null;
  engineName: string | null;
  transmissionId: number | null;
  transmissionType: string | null;
}

export interface CatalogSearchResponse {
  items: CatalogSearchResult[];
  total: number;
  page: number;
  pageSize: number;
}

export interface CascadeDeleteResult {
  deletedMakes: number;
  deletedModels: number;
  deletedYears: number;
  deletedTrims: number;
  deletedEngines: number;
  totalDeleted: number;
}

const VEHICLE_SCHEMA = 'vehicles';

export class VehicleCatalogService {
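Each cascade-delete method fills the per-entity counters first and computes `totalDeleted` as their sum afterwards. A tiny sketch (illustrative only, not part of the commit; the `withTotal` helper is hypothetical) pinning down that invariant:

```typescript
// Hypothetical helper: derive totalDeleted from the per-entity counters,
// mirroring how the cascade methods compute it after the transaction commits.
interface CascadeDeleteResult {
  deletedMakes: number;
  deletedModels: number;
  deletedYears: number;
  deletedTrims: number;
  deletedEngines: number;
  totalDeleted: number;
}

function withTotal(partial: Omit<CascadeDeleteResult, 'totalDeleted'>): CascadeDeleteResult {
  const totalDeleted =
    partial.deletedMakes + partial.deletedModels + partial.deletedYears +
    partial.deletedTrims + partial.deletedEngines;
  return { ...partial, totalDeleted };
}

const r = withTotal({ deletedMakes: 1, deletedModels: 3, deletedYears: 12, deletedTrims: 40, deletedEngines: 7 });
console.log(r.totalDeleted); // 63
```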
@@ -611,6 +639,338 @@ export class VehicleCatalogService {
    }
  }

  // SEARCH -----------------------------------------------------------------

  async searchCatalog(
    query: string,
    page: number = 1,
    pageSize: number = 50
  ): Promise<CatalogSearchResponse> {
    const offset = (page - 1) * pageSize;
    const sanitizedQuery = query.trim();

    if (!sanitizedQuery) {
      return { items: [], total: 0, page, pageSize };
    }

    // Convert query to tsquery format - split words and join with &
    const tsQueryTerms = sanitizedQuery
      .split(/\s+/)
      .filter(term => term.length > 0)
      .map(term => term.replace(/[^\w]/g, ''))
      .filter(term => term.length > 0)
      .map(term => `${term}:*`)
      .join(' & ');

    if (!tsQueryTerms) {
      return { items: [], total: 0, page, pageSize };
    }

    try {
      // Count total matching records
      const countQuery = `
        SELECT COUNT(*) as total
        FROM vehicle_options vo
        WHERE to_tsvector('english', vo.year::text || ' ' || vo.make || ' ' || vo.model || ' ' || vo.trim)
          @@ to_tsquery('english', $1)
      `;
      const countResult = await this.pool.query(countQuery, [tsQueryTerms]);
      const total = parseInt(countResult.rows[0].total, 10);

      // Fetch paginated results with engine and transmission details
      const searchQuery = `
        SELECT
          vo.id,
          vo.year,
          vo.make,
          vo.model,
          vo.trim,
          vo.engine_id,
          e.name AS engine_name,
          vo.transmission_id,
          t.type AS transmission_type
        FROM vehicle_options vo
        LEFT JOIN engines e ON vo.engine_id = e.id
        LEFT JOIN transmissions t ON vo.transmission_id = t.id
        WHERE to_tsvector('english', vo.year::text || ' ' || vo.make || ' ' || vo.model || ' ' || vo.trim)
          @@ to_tsquery('english', $1)
        ORDER BY vo.year DESC, vo.make ASC, vo.model ASC, vo.trim ASC
        LIMIT $2 OFFSET $3
      `;

      const result = await this.pool.query(searchQuery, [tsQueryTerms, pageSize, offset]);

      const items: CatalogSearchResult[] = result.rows.map((row) => ({
        id: Number(row.id),
        year: Number(row.year),
        make: row.make,
        model: row.model,
        trim: row.trim,
        engineId: row.engine_id ? Number(row.engine_id) : null,
        engineName: row.engine_name || null,
        transmissionId: row.transmission_id ? Number(row.transmission_id) : null,
        transmissionType: row.transmission_type || null,
      }));

      return { items, total, page, pageSize };
    } catch (error) {
      logger.error('Error searching catalog', { error, query: sanitizedQuery });
      throw error;
    }
  }

  // CASCADE DELETE METHODS -------------------------------------------------

  async deleteMakeCascade(makeId: number, changedBy: string): Promise<CascadeDeleteResult> {
    const result: CascadeDeleteResult = {
      deletedMakes: 0,
      deletedModels: 0,
      deletedYears: 0,
      deletedTrims: 0,
      deletedEngines: 0,
      totalDeleted: 0,
    };

    await this.runInTransaction(async (client) => {
      // Verify make exists
      const makeResult = await client.query(
        `SELECT id, name, created_at, updated_at FROM ${VEHICLE_SCHEMA}.make WHERE id = $1`,
        [makeId]
      );
      if (makeResult.rowCount === 0) {
        throw new Error(`Make ${makeId} not found`);
      }
      const makeData = this.mapMakeRow(makeResult.rows[0]);

      // Get all models for this make
      const modelsResult = await client.query(
        `SELECT id FROM ${VEHICLE_SCHEMA}.model WHERE make_id = $1`,
        [makeId]
      );

      // Cascade delete all models and their children
      for (const modelRow of modelsResult.rows) {
        const modelDeletes = await this.deleteModelCascadeInTransaction(client, modelRow.id, changedBy);
        result.deletedModels += modelDeletes.deletedModels;
        result.deletedYears += modelDeletes.deletedYears;
        result.deletedTrims += modelDeletes.deletedTrims;
        result.deletedEngines += modelDeletes.deletedEngines;
      }

      // Delete the make itself
      await client.query(`DELETE FROM ${VEHICLE_SCHEMA}.make WHERE id = $1`, [makeId]);
      await this.logChange(client, 'DELETE', 'makes', makeId.toString(), makeData, null, changedBy);
      result.deletedMakes = 1;
    });

    result.totalDeleted = result.deletedMakes + result.deletedModels + result.deletedYears + result.deletedTrims + result.deletedEngines;
    await this.cacheService.invalidateVehicleData();
    return result;
  }

  async deleteModelCascade(modelId: number, changedBy: string): Promise<CascadeDeleteResult> {
    const result: CascadeDeleteResult = {
      deletedMakes: 0,
      deletedModels: 0,
      deletedYears: 0,
      deletedTrims: 0,
      deletedEngines: 0,
      totalDeleted: 0,
    };

    await this.runInTransaction(async (client) => {
      const deletes = await this.deleteModelCascadeInTransaction(client, modelId, changedBy);
      result.deletedModels = deletes.deletedModels;
      result.deletedYears = deletes.deletedYears;
      result.deletedTrims = deletes.deletedTrims;
      result.deletedEngines = deletes.deletedEngines;
    });

    result.totalDeleted = result.deletedModels + result.deletedYears + result.deletedTrims + result.deletedEngines;
    await this.cacheService.invalidateVehicleData();
    return result;
  }

  async deleteYearCascade(yearId: number, changedBy: string): Promise<CascadeDeleteResult> {
    const result: CascadeDeleteResult = {
      deletedMakes: 0,
      deletedModels: 0,
      deletedYears: 0,
      deletedTrims: 0,
      deletedEngines: 0,
      totalDeleted: 0,
    };

    await this.runInTransaction(async (client) => {
      const deletes = await this.deleteYearCascadeInTransaction(client, yearId, changedBy);
      result.deletedYears = deletes.deletedYears;
      result.deletedTrims = deletes.deletedTrims;
      result.deletedEngines = deletes.deletedEngines;
    });

    result.totalDeleted = result.deletedYears + result.deletedTrims + result.deletedEngines;
    await this.cacheService.invalidateVehicleData();
    return result;
  }

  async deleteTrimCascade(trimId: number, changedBy: string): Promise<CascadeDeleteResult> {
    const result: CascadeDeleteResult = {
      deletedMakes: 0,
      deletedModels: 0,
      deletedYears: 0,
      deletedTrims: 0,
      deletedEngines: 0,
      totalDeleted: 0,
    };

    await this.runInTransaction(async (client) => {
      const deletes = await this.deleteTrimCascadeInTransaction(client, trimId, changedBy);
      result.deletedTrims = deletes.deletedTrims;
      result.deletedEngines = deletes.deletedEngines;
    });

    result.totalDeleted = result.deletedTrims + result.deletedEngines;
    await this.cacheService.invalidateVehicleData();
    return result;
  }

  // Private cascade helpers (run within existing transaction)

  private async deleteModelCascadeInTransaction(
    client: PoolClient,
    modelId: number,
    changedBy: string
  ): Promise<{ deletedModels: number; deletedYears: number; deletedTrims: number; deletedEngines: number }> {
    const result = { deletedModels: 0, deletedYears: 0, deletedTrims: 0, deletedEngines: 0 };

    // Verify model exists
    const modelResult = await client.query(
      `SELECT id, make_id, name, created_at, updated_at FROM ${VEHICLE_SCHEMA}.model WHERE id = $1`,
      [modelId]
    );
    if (modelResult.rowCount === 0) {
      throw new Error(`Model ${modelId} not found`);
    }
    const modelData = this.mapModelRow(modelResult.rows[0]);

    // Get all years for this model
    const yearsResult = await client.query(
      `SELECT id FROM ${VEHICLE_SCHEMA}.model_year WHERE model_id = $1`,
      [modelId]
    );

    // Cascade delete all years and their children
    for (const yearRow of yearsResult.rows) {
      const yearDeletes = await this.deleteYearCascadeInTransaction(client, yearRow.id, changedBy);
      result.deletedYears += yearDeletes.deletedYears;
      result.deletedTrims += yearDeletes.deletedTrims;
      result.deletedEngines += yearDeletes.deletedEngines;
    }

    // Delete the model itself
    await client.query(`DELETE FROM ${VEHICLE_SCHEMA}.model WHERE id = $1`, [modelId]);
    await this.logChange(client, 'DELETE', 'models', modelId.toString(), modelData, null, changedBy);
    result.deletedModels = 1;

    return result;
  }

  private async deleteYearCascadeInTransaction(
    client: PoolClient,
    yearId: number,
    changedBy: string
  ): Promise<{ deletedYears: number; deletedTrims: number; deletedEngines: number }> {
    const result = { deletedYears: 0, deletedTrims: 0, deletedEngines: 0 };

    // Verify year exists
    const yearResult = await client.query(
      `SELECT id, model_id, year, created_at, updated_at FROM ${VEHICLE_SCHEMA}.model_year WHERE id = $1`,
      [yearId]
    );
    if (yearResult.rowCount === 0) {
      throw new Error(`Year ${yearId} not found`);
    }
    const yearData = this.mapYearRow(yearResult.rows[0]);

    // Get all trims for this year
    const trimsResult = await client.query(
      `SELECT id FROM ${VEHICLE_SCHEMA}.trim WHERE model_year_id = $1`,
      [yearId]
    );

    // Cascade delete all trims and their engines
    for (const trimRow of trimsResult.rows) {
      const trimDeletes = await this.deleteTrimCascadeInTransaction(client, trimRow.id, changedBy);
      result.deletedTrims += trimDeletes.deletedTrims;
      result.deletedEngines += trimDeletes.deletedEngines;
    }

    // Delete the year itself
    await client.query(`DELETE FROM ${VEHICLE_SCHEMA}.model_year WHERE id = $1`, [yearId]);
    await this.logChange(client, 'DELETE', 'years', yearId.toString(), yearData, null, changedBy);
    result.deletedYears = 1;

    return result;
  }

  private async deleteTrimCascadeInTransaction(
    client: PoolClient,
    trimId: number,
    changedBy: string
  ): Promise<{ deletedTrims: number; deletedEngines: number }> {
    const result = { deletedTrims: 0, deletedEngines: 0 };

    // Verify trim exists
    const trimResult = await client.query(
      `SELECT id, model_year_id, name, created_at, updated_at FROM ${VEHICLE_SCHEMA}.trim WHERE id = $1`,
      [trimId]
    );
    if (trimResult.rowCount === 0) {
      throw new Error(`Trim ${trimId} not found`);
    }
    const trimData = this.mapTrimRow(trimResult.rows[0]);

    // Get all engines linked to this trim
    const enginesResult = await client.query(
      `SELECT e.id, e.name, e.displacement_l, e.cylinders, e.fuel_type, e.created_at, e.updated_at, te.trim_id
       FROM ${VEHICLE_SCHEMA}.engine e
       JOIN ${VEHICLE_SCHEMA}.trim_engine te ON te.engine_id = e.id
       WHERE te.trim_id = $1`,
      [trimId]
    );

    // Delete engine links and potentially orphaned engines
    for (const engineRow of enginesResult.rows) {
      const engineData = this.mapEngineRow(engineRow);

      // Remove the link first
      await client.query(
        `DELETE FROM ${VEHICLE_SCHEMA}.trim_engine WHERE trim_id = $1 AND engine_id = $2`,
        [trimId, engineRow.id]
      );

      // Check if this engine is used by other trims
      const otherLinksResult = await client.query(
        `SELECT 1 FROM ${VEHICLE_SCHEMA}.trim_engine WHERE engine_id = $1 LIMIT 1`,
        [engineRow.id]
      );

      // If no other trims use this engine, delete the engine itself
      if ((otherLinksResult.rowCount || 0) === 0) {
        await client.query(`DELETE FROM ${VEHICLE_SCHEMA}.engine WHERE id = $1`, [engineRow.id]);
        await this.logChange(client, 'DELETE', 'engines', engineRow.id.toString(), engineData, null, changedBy);
        result.deletedEngines += 1;
      }
    }

    // Delete the trim itself
    await client.query(`DELETE FROM ${VEHICLE_SCHEMA}.trim WHERE id = $1`, [trimId]);
    await this.logChange(client, 'DELETE', 'trims', trimId.toString(), trimData, null, changedBy);
    result.deletedTrims = 1;

    return result;
  }

  // HELPERS ----------------------------------------------------------------

  private async runInTransaction<T>(handler: (client: PoolClient) => Promise<T>): Promise<T> {
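The search endpoint's safety hinges on the term sanitization before `to_tsquery`: punctuation is stripped so user input cannot inject tsquery syntax, and each word becomes a prefix match. A standalone sketch (illustrative only, not part of the commit; `buildTsQuery` is a hypothetical extraction of the inline logic in `searchCatalog`):

```typescript
// Hypothetical extraction of searchCatalog's tsquery building, so the
// sanitization rules can be tested without a database.
function buildTsQuery(raw: string): string {
  return raw
    .trim()
    .split(/\s+/)
    .filter(term => term.length > 0)
    .map(term => term.replace(/[^\w]/g, '')) // strip punctuation; keeps to_tsquery input valid
    .filter(term => term.length > 0)          // drop terms that were pure punctuation
    .map(term => `${term}:*`)                 // prefix match per word
    .join(' & ');
}

console.log(buildTsQuery('2020 F-150 XLT')); // → 2020:* & F150:* & XLT:*
console.log(buildTsQuery('!!!') === '');     // an all-punctuation query yields '', so the caller short-circuits
```

Note the hyphen handling: `F-150` collapses to `F150`, which only matches because the indexed tsvector tokenizes it compatibly; an exact guarantee would need the same normalization on both sides.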
@@ -0,0 +1,14 @@
-- Migration: Add full-text search index for vehicle catalog search
-- Date: 2025-12-15

-- Add full-text search index on vehicle_options table
-- Combines year, make, model, and trim into a single searchable tsvector
-- Using || operator instead of concat() because || is IMMUTABLE
CREATE INDEX IF NOT EXISTS idx_vehicle_options_fts ON vehicle_options
USING gin(to_tsvector('english', year::text || ' ' || make || ' ' || model || ' ' || trim));

-- Add an index on engines.name for join performance during search
CREATE INDEX IF NOT EXISTS idx_engines_name ON engines(name);

-- Add comment for documentation
COMMENT ON INDEX idx_vehicle_options_fts IS 'Full-text search index for admin catalog search functionality';
@@ -1,53 +0,0 @@
|
|||||||
acura
|
|
||||||
alfa_romeo
|
|
||||||
aston_martin
|
|
||||||
audi
|
|
||||||
bentley
|
|
||||||
bmw
|
|
||||||
buick
|
|
||||||
cadillac
|
|
||||||
chevrolet
|
|
||||||
chrysler
|
|
||||||
dodge
|
|
||||||
ferrari
|
|
||||||
fiat
|
|
||||||
ford
|
|
||||||
genesis
|
|
||||||
gmc
|
|
||||||
honda
|
|
||||||
hummer
|
|
||||||
hyundai
|
|
||||||
infiniti
|
|
||||||
isuzu
|
|
||||||
jaguar
|
|
||||||
jeep
|
|
||||||
kia
|
|
||||||
lamborghini
|
|
||||||
land_rover
|
|
||||||
lexus
|
|
||||||
lincoln
|
|
||||||
lotus
|
|
||||||
lucid
|
|
||||||
maserati
|
|
||||||
mazda
|
|
||||||
mclaren
|
|
||||||
mercury
|
|
||||||
mini
|
|
||||||
mitsubishi
|
|
||||||
nissan
|
|
||||||
oldsmobile
|
|
||||||
plymouth
|
|
||||||
polestar
|
|
||||||
pontiac
|
|
||||||
porsche
|
|
||||||
ram
|
|
||||||
rivian
|
|
||||||
rolls_royce
|
|
||||||
saab
|
|
||||||
scion
|
|
||||||
smart
|
|
||||||
subaru
|
|
||||||
tesla
|
|
||||||
toyota
|
|
||||||
volkswagen
|
|
||||||
volvo
|
|
||||||
41 data/vehicle-etl/README.md Normal file
@@ -0,0 +1,41 @@
Step 1: Fetch Data from VehAPI

cd data/vehicle-etl
python3 vehapi_fetch_snapshot.py --min-year 2015 --max-year 2025

Options:

| Flag                | Default           | Description            |
|---------------------|-------------------|------------------------|
| --min-year          | 2015              | Start year             |
| --max-year          | 2022              | End year               |
| --rate-per-min      | 55                | API rate limit         |
| --snapshot-dir      | snapshots/<today> | Output directory       |
| --no-response-cache | false             | Disable resume caching |

Output: Creates snapshots/<date>/snapshot.sqlite

---

Step 2: Generate SQL Files

python3 etl_generate_sql.py --snapshot-path snapshots/<date>/snapshot.sqlite

Output: Creates output/01_engines.sql, output/02_transmissions.sql, output/03_vehicle_options.sql

---

Step 3: Import to PostgreSQL

./import_data.sh

Requires: mvp-postgres container running, SQL files in output/

---

Quick Test (single year)

python3 vehapi_fetch_snapshot.py --min-year 2020 --max-year 2020

# Full ETL workflow
./reset_database.sh                               # Clear old data
python3 vehapi_fetch_snapshot.py                  # Fetch from API
python3 etl_generate_sql.py                       # Generate SQL
./import_data.sh                                  # Import to Postgres
docker compose exec mvp-redis redis-cli FLUSHALL  # Flush Redis cache for front end
Binary file not shown.
@@ -1,22 +1,128 @@
 -- Auto-generated by etl_generate_sql.py
 INSERT INTO engines (id, name, fuel_type) VALUES
 (1,'Gas','Gas'),
-(2,'2.0L 150 hp I4','Gas'),
+(2,'2.4L 201 hp I4','Gas'),
-(3,'2.4L 201 hp I4','Gas'),
+(3,'3.5L 290 hp V6','Gas'),
-(4,'3.5L 290 hp V6','Gas'),
+(4,'3.0L 321 hp V6 Hybrid','Hybrid'),
-(5,'3.5L 273 hp V6','Gas'),
+(5,'3.5L 573 hp V6 Hybrid','Hybrid'),
-(6,'3.5L 310 hp V6','Gas'),
+(6,'3.5L 279 hp V6','Gas'),
-(7,'2.4L 206 hp I4','Gas'),
+(7,'3.5L 310 hp V6','Gas'),
-(8,'2.0L 220 hp I4','Gas'),
+(8,'3.5L 377 hp V6 Hybrid','Hybrid'),
-(9,'1.8L 170 hp I4','Gas'),
+(9,'2.4L 206 hp I4','Gas'),
-(10,'Diesel','Diesel'),
+(10,'2.0L 220 hp I4','Gas'),
-(11,'2.0L 150 hp I4 Diesel','Diesel'),
+(11,'2.0L 186 hp I4','Gas'),
-(12,'2.0L 220 hp I4 Flex Fuel Vehicle','Gas'),
+(12,'1.4L 204 hp I4','Gas'),
-(13,'3.0L 310 hp V6','Gas'),
+(13,'2.0L 190 hp I4','Gas'),
-(14,'3.0L 240 hp V6 Diesel','Diesel'),
+(14,'2.0L 252 hp I4','Gas'),
-(15,'4.0L 435 hp V8','Diesel'),
+(15,'2.0L 220 hp I4 Flex Fuel Vehicle','Gas'),
 (16,'3.0L 333 hp V6','Gas'),
-(17,'6.3L 500 hp W12','Gas'),
+(17,'3.0L 340 hp V6','Gas'),
-(18,'2.0L 200 hp I4','Gas'),
+(18,'4.0L 450 hp V8','Gas'),
-(19,'3.0L 272 hp V6','Gas');
+(19,'2.0L 200 hp I4','Gas'),
+(20,'3.0L 272 hp V6','Gas'),
+(21,'Diesel','Diesel'),
+(22,'5.2L 540 hp V10','Gas'),
+(23,'5.2L 610 hp V10','Gas'),
+(24,'2.5L 400 hp I5','Gas'),
+(25,'4.0L 560 hp V8','Gas'),
+(26,'4.0L 605 hp V8','Gas'),
+(27,'2.0L 292 hp I4','Gas'),
+(28,'3.0L 354 hp V6','Gas'),
+(29,'2.0L 248 hp I4','Gas'),
+(30,'3.0L 335 hp I6','Gas'),
+(31,'2.0L 180 hp I4','Gas'),
+(32,'2.0L 180 hp I4 Diesel','Diesel'),
+(33,'3.0L 320 hp I6','Gas'),
+(34,'3.0L 300 hp I6','Gas'),
+(35,'4.4L 445 hp V8','Gas'),
+(36,'3.0L 315 hp I6','Gas'),
+(37,'4.4L 600 hp V8','Gas'),
+(38,'2.0L 322 hp I4','Gas'),
+(39,'6.6L 601 hp V12','Gas'),
+(40,'3.0L 365 hp I6','Gas'),
+(41,'3.0L 425 hp I6','Gas'),
+(42,'4.4L 552 hp V8','Gas'),
+(43,'2.0L 228 hp I4','Gas'),
+(44,'2.0L 240 hp I4','Gas'),
+(45,'3.0L 355 hp I6','Gas'),
+(46,'3.0L 255 hp I6 Diesel','Diesel'),
+(47,'2.0L 308 hp I4','Gas'),
+(48,'4.4L 567 hp V8','Gas'),
+(49,'0.7L 168 hp I2','Gas'),
+(50,'170 hp Electric','Electric'),
+(51,'168 hp Electric','Electric'),
+(52,'0.7L 170 hp I2','Gas'),
+(53,'1.5L 357 hp I3','Gas'),
+(54,'6.0L 600 hp W12','Gas'),
+(55,'6.0L 633 hp W12','Gas'),
+(56,'4.0L 500 hp V8','Gas'),
+(57,'4.0L 521 hp V8','Gas'),
+(58,'6.0L 582 hp W12','Gas'),
+(59,'6.0L 700 hp W12','Gas'),
+(60,'6.0L 616 hp W12','Gas'),
+(61,'6.0L 626 hp W12','Gas'),
+(62,'6.8L 505 hp V8','Gas'),
+(63,'6.8L 530 hp V8','Gas'),
+(64,'1.6L 200 hp I4','Gas'),
+(65,'3.6L 288 hp V6','Gas'),
+(66,'1.4L 138 hp I4','Gas'),
+(67,'1.4L 153 hp I4','Gas'),
+(68,'2.5L 197 hp I4','Gas'),
+(69,'3.6L 310 hp V6','Gas'),
+(70,'2.4L 182 hp I4','Gas'),
+(71,'2.4L 182 hp I4 Flex Fuel Vehicle','Gas'),
+(72,'2.0L 259 hp I4','Gas'),
+(73,'2.4L 180 hp I4','Gas'),
+(74,'2.4L 180 hp I4 Flex Fuel Vehicle','Gas'),
+(75,'2.0L 272 hp I4','Gas'),
+(76,'3.6L 335 hp V6','Gas'),
+(77,'2.5L 202 hp I4','Gas'),
+(78,'3.6L 464 hp V6','Gas'),
+(79,'2.0L 265 hp I4','Gas'),
+(80,'3.0L 404 hp V6','Gas'),
+(81,'2.0L 335 hp I4','Gas'),
+(82,'2.0L 268 hp I4','Gas'),
+(83,'3.6L 420 hp V6','Gas'),
+(84,'6.2L 640 hp V8','Gas'),
+(85,'6.2L 420 hp V8','Gas'),
+(86,'3.6L 304 hp V6','Gas'),
+(87,'3.6L 410 hp V6','Gas'),
+(88,'200 hp Electric','Electric'),
+(89,'2.0L 275 hp I4','Gas'),
+(90,'6.2L 455 hp V8','Gas'),
+(91,'6.2L 650 hp V8','Gas'),
+(92,'3.6L 301 hp V6 Flex Fuel Vehicle','Gas'),
+(93,'6.0L 355 hp V8 Flex Fuel Vehicle','Gas'),
+(94,'2.0L 131 hp I4','Gas'),
+(95,'2.5L 200 hp I4','Gas'),
+(96,'2.8L 181 hp I4 Diesel','Diesel'),
+(97,'3.6L 308 hp V6','Gas'),
+(98,'6.2L 460 hp V8','Gas'),
+(99,'1.6L 137 hp I4 Diesel','Diesel'),
+(100,'3.6L 301 hp V6','Gas'),
+(101,'4.8L 285 hp V8','Gas'),
+(102,'6.0L 342 hp V8 Flex Fuel Vehicle','Gas'),
+(103,'3.6L 260 hp V6 Compressed Natural Gas','Gas'),
+(104,'3.6L 305 hp V6 Flex Fuel Vehicle','Gas'),
+(105,'1.5L 160 hp I4','Gas'),
|
||||||
|
(106,'2.0L 250 hp I4','Gas'),
|
||||||
|
(107,'1.8L 182 hp I4 Hybrid','Hybrid'),
|
||||||
|
(108,'6.2L 415 hp V8','Gas'),
|
||||||
|
(109,'4.3L 285 hp V6 Flex Fuel Vehicle','Gas'),
|
||||||
|
(110,'5.3L 355 hp V8','Gas'),
|
||||||
|
(111,'6.0L 360 hp V8 Flex Fuel Vehicle','Gas'),
|
||||||
|
(112,'6.6L 445 hp V8 Biodiesel','Diesel'),
|
||||||
|
(113,'1.8L 138 hp I4','Gas'),
|
||||||
|
(114,'1.4L 98 hp I4','Gas'),
|
||||||
|
(115,'5.3L 355 hp V8 Flex Fuel Vehicle','Gas'),
|
||||||
|
(116,'6.0L 360 hp V8','Gas'),
|
||||||
|
(117,'3.6L 281 hp V6','Gas'),
|
||||||
|
(118,'1.5L 149 hp I4','Gas'),
|
||||||
|
(119,'2.4L 184 hp I4','Gas'),
|
||||||
|
(120,'3.6L 295 hp V6','Gas'),
|
||||||
|
(121,'3.6L 292 hp V6','Gas'),
|
||||||
|
(122,'5.7L 363 hp V8','Gas'),
|
||||||
|
(123,'3.6L 300 hp V6','Gas'),
|
||||||
|
(124,'3.6L 287 hp V6','Gas'),
|
||||||
|
(125,'3.6L 260 hp V6','Gas');
|
||||||
|
|
||||||
|
|||||||
@@ -2,12 +2,19 @@
 INSERT INTO transmissions (id, type) VALUES
 (1,'Automatic'),
 (2,'Manual'),
-(3,'5-Speed Automatic'),
+(3,'8-Speed Dual Clutch'),
-(4,'6-Speed Manual'),
+(4,'9-Speed Automatic'),
-(5,'6-Speed Automatic'),
+(5,'7-Speed Dual Clutch'),
-(6,'8-Speed Dual Clutch'),
+(6,'9-Speed Dual Clutch'),
-(7,'9-Speed Automatic'),
+(7,'6-Speed Automatic'),
 (8,'6-Speed Dual Clutch'),
-(9,'8-Speed Automatic'),
+(9,'6-Speed Manual'),
-(10,'Continuously Variable Transmission');
+(10,'8-Speed Automatic'),
+(11,'1-Speed Dual Clutch'),
+(12,'6-Speed Automatic Overdrive'),
+(13,'4-Speed Automatic'),
+(14,'10-Speed Automatic'),
+(15,'Continuously Variable Transmission'),
+(16,'7-Speed Manual'),
+(17,'5-Speed Manual');
File diff suppressed because it is too large. Load Diff

data/vehicle-etl/reset_database.sh (new executable file, 56 lines)
@@ -0,0 +1,56 @@
+#!/bin/bash
+# Reset vehicle database tables before a fresh import.
+
+set -euo pipefail
+
+SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
+cd "$SCRIPT_DIR"
+
+echo "=========================================="
+echo "Vehicle Database Reset"
+echo "=========================================="
+echo ""
+
+# Check if postgres container is running
+if ! docker ps --filter "name=mvp-postgres" --format "{{.Names}}" | grep -q "mvp-postgres"; then
+    echo "Error: mvp-postgres container is not running"
+    exit 1
+fi
+
+echo "Current data (before reset):"
+docker exec mvp-postgres psql -U postgres -d motovaultpro -c \
+    "SELECT
+        (SELECT COUNT(*) FROM engines) as engines,
+        (SELECT COUNT(*) FROM transmissions) as transmissions,
+        (SELECT COUNT(*) FROM vehicle_options) as vehicle_options;" 2>/dev/null || echo " Tables may not exist yet"
+echo ""
+
+# Confirm reset
+read -p "Are you sure you want to reset all vehicle data? (y/N) " -n 1 -r
+echo ""
+if [[ ! $REPLY =~ ^[Yy]$ ]]; then
+    echo "Reset cancelled."
+    exit 0
+fi
+
+echo ""
+echo "Truncating tables..."
+docker exec -i mvp-postgres psql -U postgres -d motovaultpro <<'EOF'
+TRUNCATE TABLE vehicle_options RESTART IDENTITY CASCADE;
+TRUNCATE TABLE engines RESTART IDENTITY CASCADE;
+TRUNCATE TABLE transmissions RESTART IDENTITY CASCADE;
+EOF
+
+echo ""
+echo "=========================================="
+echo "Reset complete"
+echo "=========================================="
+echo ""
+echo "Verification (should all be 0):"
+docker exec mvp-postgres psql -U postgres -d motovaultpro -c \
+    "SELECT
+        (SELECT COUNT(*) FROM engines) as engines,
+        (SELECT COUNT(*) FROM transmissions) as transmissions,
+        (SELECT COUNT(*) FROM vehicle_options) as vehicle_options;"
+echo ""
+echo "Ready for fresh import with: ./import_data.sh"
@@ -32,7 +32,7 @@ except ImportError: # pragma: no cover - env guard
 SCRIPT_VERSION = "vehapi_fetch_snapshot.py@1.1.0"
 DEFAULT_MIN_YEAR = 2015
 DEFAULT_MAX_YEAR = 2022
-DEFAULT_RATE_PER_SEC = 55  # stays under the 60 req/sec ceiling
+DEFAULT_RATE_PER_MIN = 55  # stays under the 60 req/min ceiling
 MAX_ATTEMPTS = 5
 FALLBACK_TRIMS = ["Base"]
 FALLBACK_TRANSMISSIONS = ["Manual", "Automatic"]
@@ -95,22 +95,18 @@ def ensure_snapshot_dir(root: Path, custom_dir: Optional[str]) -> Path:


 class RateLimiter:
-    """Simple leaky bucket limiter to stay below the VehAPI threshold."""
+    """Fixed delay limiter to stay below the VehAPI threshold (60 req/min)."""

-    def __init__(self, max_per_sec: int) -> None:
-        self.max_per_sec = max_per_sec
-        self._history: List[float] = []
+    def __init__(self, max_per_min: int) -> None:
+        self.delay = 60.0 / max_per_min  # ~1.09 sec for 55 rpm
+        self._last_request = 0.0

     def acquire(self) -> None:
-        while True:
-            now = time.monotonic()
-            window_start = now - 1
-            self._history = [ts for ts in self._history if ts >= window_start]
-            if len(self._history) < self.max_per_sec:
-                break
-            sleep_for = max(self._history[0] - window_start, 0.001)
-            time.sleep(sleep_for)
-        self._history.append(time.monotonic())
+        now = time.monotonic()
+        elapsed = now - self._last_request
+        if elapsed < self.delay:
+            time.sleep(self.delay - elapsed)
+        self._last_request = time.monotonic()


 @dataclass
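For reference, the new fixed-delay limiter can be exercised on its own. The sketch below is a standalone copy of the class from the hunk above (not an import from the script); the demo at the end uses a deliberately high rate so the spacing is observable without a long sleep.

```python
import time


class RateLimiter:
    """Fixed-delay limiter: each acquire() sleeps so that consecutive
    calls are at least 60 / max_per_min seconds apart."""

    def __init__(self, max_per_min: int) -> None:
        self.delay = 60.0 / max_per_min  # ~1.09 s for 55 req/min
        self._last_request = 0.0

    def acquire(self) -> None:
        now = time.monotonic()
        elapsed = now - self._last_request
        if elapsed < self.delay:
            time.sleep(self.delay - elapsed)
        self._last_request = time.monotonic()


# 6000 req/min -> 10 ms spacing, so three acquires take at least ~20 ms.
limiter = RateLimiter(6000)
start = time.monotonic()
for _ in range(3):
    limiter.acquire()
total = time.monotonic() - start
```

Compared to the old leaky bucket, this trades burst capacity for simplicity: requests are evenly paced instead of allowed to cluster inside a one-second window.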
@@ -132,7 +128,7 @@ class VehapiFetcher:
         allowed_makes: Sequence[str],
         snapshot_path: Path,
         responses_cache: bool = True,
-        rate_per_sec: int = DEFAULT_RATE_PER_SEC,
+        rate_per_min: int = DEFAULT_RATE_PER_MIN,
     ) -> None:
         self.session = session
         self.base_url = base_url.rstrip("/")
@@ -146,7 +142,7 @@ class VehapiFetcher:
         self.conn.execute("PRAGMA synchronous=NORMAL;")
         self._init_schema()
         self.responses_cache = responses_cache
-        self.rate_limiter = RateLimiter(rate_per_sec)
+        self.rate_limiter = RateLimiter(rate_per_min)
         self.counts = FetchCounts()

     def _init_schema(self) -> None:
@@ -251,7 +247,7 @@ class VehapiFetcher:
                 retry_seconds = float(retry_after)
             except (TypeError, ValueError):
                 retry_seconds = 30.0
-            sleep_for = retry_seconds + random.uniform(0, 3)
+            sleep_for = retry_seconds + random.uniform(0, 0.5)
             print(f"[info] {label}: hit 429, sleeping {sleep_for:.1f}s before retry", file=sys.stderr)
             time.sleep(sleep_for)
             backoff = min(backoff * 2, 30)
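The hunk above shrinks the jitter added on top of a 429 response's Retry-After from up to 3 s to up to 0.5 s, which matters now that the limiter already spaces requests ~1.1 s apart. A minimal sketch of that sleep computation (hypothetical helper written for illustration, mirroring the hunk's logic):

```python
import random


def sleep_for_429(retry_after, backoff):
    """Return (sleep_seconds, next_backoff) after a 429 response.

    retry_after is the Retry-After header value (string or None);
    it falls back to 30 s when missing or unparseable. The backoff
    doubles on each retry, capped at 30 s.
    """
    try:
        retry_seconds = float(retry_after)
    except (TypeError, ValueError):
        retry_seconds = 30.0
    sleep_for = retry_seconds + random.uniform(0, 0.5)  # small jitter
    next_backoff = min(backoff * 2, 30)
    return sleep_for, next_backoff


sleep1, b1 = sleep_for_429("2", 4)    # honors Retry-After: 2
sleep2, b2 = sleep_for_429(None, 20)  # missing header -> 30 s base
```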
@@ -374,6 +370,7 @@ class VehapiFetcher:
                 self._fetch_engines_for_transmission(year, make, model, trim, trans, trans_bucket)

     def _fetch_trims_for_model(self, year: int, make: str, model: str) -> None:
+        print(f" -> {year} {make} {model}", file=sys.stderr)
         path = ["trims", str(year), make, model]
         label = f"trims:{year}/{make}/{model}"
         trims_payload = self._request_json(path, label)
@@ -416,9 +413,10 @@ class VehapiFetcher:
                 print(f"[info] {year}: no allowed makes found, skipping", file=sys.stderr)
                 continue
             print(f"[info] {year}: {len(makes)} makes", file=sys.stderr)
-            for make in makes:
-                print(f"[info] {year} {make}: fetching models", file=sys.stderr)
+            for idx, make in enumerate(makes, 1):
+                print(f"[{year}] ({idx}/{len(makes)}) {make}", file=sys.stderr)
                 self._fetch_models_for_make(year, make)
+                print(f" [{self.counts.pairs_inserted} pairs so far]", file=sys.stderr)
         self.conn.commit()
         return self.counts
@@ -429,7 +427,7 @@ def build_arg_parser() -> argparse.ArgumentParser:
     parser.add_argument("--max-year", type=int, default=int(read_env("MAX_YEAR", DEFAULT_MAX_YEAR)), help="Inclusive max year (default env MAX_YEAR or 2026)")
     parser.add_argument("--snapshot-dir", type=str, help="Target snapshot directory (default snapshots/<today>)")
     parser.add_argument("--base-url", type=str, default=read_env("VEHAPI_BASE_URL", DEFAULT_BASE_URL), help="VehAPI base URL (e.g. https://vehapi.com/api/v1/car-lists/get/car)")
-    parser.add_argument("--rate-per-sec", type=int, default=int(read_env("VEHAPI_MAX_RPS", DEFAULT_RATE_PER_SEC)), help="Max requests per second (<=60)")
+    parser.add_argument("--rate-per-min", type=int, default=int(read_env("VEHAPI_MAX_RPM", DEFAULT_RATE_PER_MIN)), help="Max requests per minute (<=60)")
     parser.add_argument("--makes-file", type=str, default="source-makes.txt", help="Path to source-makes.txt")
     parser.add_argument("--api-key-file", type=str, default="vehapi.key", help="Path to VehAPI bearer token file")
     parser.add_argument("--no-response-cache", action="store_true", help="Disable request cache stored in snapshot.sqlite")
@@ -477,7 +475,7 @@ def main(argv: Sequence[str]) -> int:
         allowed_makes=allowed_makes,
         snapshot_path=snapshot_path,
         responses_cache=not args.no_response_cache,
-        rate_per_sec=args.rate_per_sec,
+        rate_per_min=args.rate_per_min,
     )

     started_at = datetime.now(timezone.utc)
@@ -37,6 +37,11 @@ const DocumentsMobileScreen = lazy(() => import('./features/documents/mobile/Doc
 const AdminUsersPage = lazy(() => import('./pages/admin/AdminUsersPage').then(m => ({ default: m.AdminUsersPage })));
 const AdminCatalogPage = lazy(() => import('./pages/admin/AdminCatalogPage').then(m => ({ default: m.AdminCatalogPage })));
 const AdminStationsPage = lazy(() => import('./pages/admin/AdminStationsPage').then(m => ({ default: m.AdminStationsPage })));
+
+// Admin mobile screens (lazy-loaded)
+const AdminUsersMobileScreen = lazy(() => import('./features/admin/mobile/AdminUsersMobileScreen').then(m => ({ default: m.AdminUsersMobileScreen })));
+const AdminCatalogMobileScreen = lazy(() => import('./features/admin/mobile/AdminCatalogMobileScreen').then(m => ({ default: m.AdminCatalogMobileScreen })));
+const AdminStationsMobileScreen = lazy(() => import('./features/admin/mobile/AdminStationsMobileScreen').then(m => ({ default: m.AdminStationsMobileScreen })));
 import { HomePage } from './pages/HomePage';
 import { BottomNavigation, NavigationItem } from './shared-minimal/components/mobile/BottomNavigation';
 import { GlassCard } from './shared-minimal/components/mobile/GlassCard';
@@ -604,6 +609,81 @@ function App() {
               </MobileErrorBoundary>
             </motion.div>
           )}
+          {activeScreen === "AdminUsers" && (
+            <motion.div
+              key="admin-users"
+              initial={{opacity:0, y:8}}
+              animate={{opacity:1, y:0}}
+              exit={{opacity:0, y:-8}}
+              transition={{ duration: 0.2, ease: "easeOut" }}
+            >
+              <MobileErrorBoundary screenName="AdminUsers">
+                <React.Suspense fallback={
+                  <div className="space-y-4">
+                    <GlassCard>
+                      <div className="p-4">
+                        <div className="text-slate-500 py-6 text-center">
+                          Loading admin users...
+                        </div>
+                      </div>
+                    </GlassCard>
+                  </div>
+                }>
+                  <AdminUsersMobileScreen />
+                </React.Suspense>
+              </MobileErrorBoundary>
+            </motion.div>
+          )}
+          {activeScreen === "AdminCatalog" && (
+            <motion.div
+              key="admin-catalog"
+              initial={{opacity:0, y:8}}
+              animate={{opacity:1, y:0}}
+              exit={{opacity:0, y:-8}}
+              transition={{ duration: 0.2, ease: "easeOut" }}
+            >
+              <MobileErrorBoundary screenName="AdminCatalog">
+                <React.Suspense fallback={
+                  <div className="space-y-4">
+                    <GlassCard>
+                      <div className="p-4">
+                        <div className="text-slate-500 py-6 text-center">
+                          Loading vehicle catalog...
+                        </div>
+                      </div>
+                    </GlassCard>
+                  </div>
+                }>
+                  <AdminCatalogMobileScreen />
+                </React.Suspense>
+              </MobileErrorBoundary>
+            </motion.div>
+          )}
+          {activeScreen === "AdminStations" && (
+            <motion.div
+              key="admin-stations"
+              initial={{opacity:0, y:8}}
+              animate={{opacity:1, y:0}}
+              exit={{opacity:0, y:-8}}
+              transition={{ duration: 0.2, ease: "easeOut" }}
+            >
+              <MobileErrorBoundary screenName="AdminStations">
+                <React.Suspense fallback={
+                  <div className="space-y-4">
+                    <GlassCard>
+                      <div className="p-4">
+                        <div className="text-slate-500 py-6 text-center">
+                          Loading station management...
+                        </div>
+                      </div>
+                    </GlassCard>
+                  </div>
+                }>
+                  <AdminStationsMobileScreen />
+                </React.Suspense>
+              </MobileErrorBoundary>
+            </motion.div>
+          )}
         </AnimatePresence>
         <DebugInfo />
       </Layout>
@@ -2,7 +2,7 @@ import { create } from 'zustand';
 import { persist, createJSONStorage } from 'zustand/middleware';
 import { safeStorage } from '../utils/safe-storage';

-export type MobileScreen = 'Dashboard' | 'Vehicles' | 'Log Fuel' | 'Stations' | 'Documents' | 'Settings';
+export type MobileScreen = 'Dashboard' | 'Vehicles' | 'Log Fuel' | 'Stations' | 'Documents' | 'Settings' | 'AdminUsers' | 'AdminCatalog' | 'AdminStations';
 export type VehicleSubScreen = 'list' | 'detail' | 'add' | 'edit';

 interface NavigationHistory {
@@ -27,6 +27,10 @@ import {
   StationOverview,
   CreateStationRequest,
   UpdateStationRequest,
+  CatalogSearchResponse,
+  ImportPreviewResult,
+  ImportApplyResult,
+  CascadeDeleteResult,
 } from '../types/admin.types';

 export interface AuditLogsResponse {
@@ -194,4 +198,73 @@ export const adminApi = {
   deleteStation: async (id: string): Promise<void> => {
     await apiClient.delete(`/admin/stations/${id}`);
   },
+
+  // Catalog Search
+  searchCatalog: async (
+    query: string,
+    page: number = 1,
+    pageSize: number = 50
+  ): Promise<CatalogSearchResponse> => {
+    const response = await apiClient.get<CatalogSearchResponse>('/admin/catalog/search', {
+      params: { q: query, page, pageSize },
+    });
+    return response.data;
+  },
+
+  // Catalog Import/Export
+  importPreview: async (file: File): Promise<ImportPreviewResult> => {
+    const formData = new FormData();
+    formData.append('file', file);
+    const response = await apiClient.post<ImportPreviewResult>(
+      '/admin/catalog/import/preview',
+      formData,
+      {
+        headers: { 'Content-Type': 'multipart/form-data' },
+      }
+    );
+    return response.data;
+  },
+
+  importApply: async (previewId: string): Promise<ImportApplyResult> => {
+    const response = await apiClient.post<ImportApplyResult>('/admin/catalog/import/apply', {
+      previewId,
+    });
+    return response.data;
+  },
+
+  exportCatalog: async (): Promise<Blob> => {
+    const response = await apiClient.get('/admin/catalog/export', {
+      responseType: 'blob',
+    });
+    return response.data;
+  },
+
+  // Cascade Delete
+  deleteMakeCascade: async (id: string): Promise<CascadeDeleteResult> => {
+    const response = await apiClient.delete<CascadeDeleteResult>(
+      `/admin/catalog/makes/${id}/cascade`
+    );
+    return response.data;
+  },
+
+  deleteModelCascade: async (id: string): Promise<CascadeDeleteResult> => {
+    const response = await apiClient.delete<CascadeDeleteResult>(
+      `/admin/catalog/models/${id}/cascade`
+    );
+    return response.data;
+  },
+
+  deleteYearCascade: async (id: string): Promise<CascadeDeleteResult> => {
+    const response = await apiClient.delete<CascadeDeleteResult>(
+      `/admin/catalog/years/${id}/cascade`
+    );
+    return response.data;
+  },
+
+  deleteTrimCascade: async (id: string): Promise<CascadeDeleteResult> => {
+    const response = await apiClient.delete<CascadeDeleteResult>(
+      `/admin/catalog/trims/${id}/cascade`
+    );
+    return response.data;
+  },
 };
@@ -316,3 +316,130 @@ export const useDeleteEngine = () => {
     },
   });
 };
+
+// Catalog Search
+export const useCatalogSearch = (query: string, page: number = 1, pageSize: number = 50) => {
+  const { isAuthenticated, isLoading } = useAuth0();
+
+  return useQuery({
+    queryKey: ['catalogSearch', query, page, pageSize],
+    queryFn: () => adminApi.searchCatalog(query, page, pageSize),
+    enabled: isAuthenticated && !isLoading && query.length > 0,
+    staleTime: 30 * 1000, // 30 seconds - search results can change
+    gcTime: 5 * 60 * 1000,
+    retry: 1,
+    refetchOnWindowFocus: false,
+  });
+};
+
+// Import/Export
+export const useImportPreview = () => {
+  return useMutation({
+    mutationFn: (file: File) => adminApi.importPreview(file),
+    onError: (error: ApiError) => {
+      toast.error(error.response?.data?.error || 'Failed to preview import');
+    },
+  });
+};
+
+export const useImportApply = () => {
+  const queryClient = useQueryClient();
+
+  return useMutation({
+    mutationFn: (previewId: string) => adminApi.importApply(previewId),
+    onSuccess: (result) => {
+      queryClient.invalidateQueries({ queryKey: ['catalogSearch'] });
+      toast.success(
+        `Import completed: ${result.created} created, ${result.updated} updated, ${result.deleted} deleted`
+      );
+    },
+    onError: (error: ApiError) => {
+      toast.error(error.response?.data?.error || 'Failed to apply import');
+    },
+  });
+};
+
+export const useExportCatalog = () => {
+  return useMutation({
+    mutationFn: () => adminApi.exportCatalog(),
+    onSuccess: (blob) => {
+      const url = window.URL.createObjectURL(blob);
+      const link = document.createElement('a');
+      link.href = url;
+      link.download = 'vehicle-catalog.csv';
+      document.body.appendChild(link);
+      link.click();
+      document.body.removeChild(link);
+      window.URL.revokeObjectURL(url);
+      toast.success('Catalog exported successfully');
+    },
+    onError: (error: ApiError) => {
+      toast.error(error.response?.data?.error || 'Failed to export catalog');
+    },
+  });
+};
+
+// Cascade Delete
+export const useDeleteMakeCascade = () => {
+  const queryClient = useQueryClient();
+
+  return useMutation({
+    mutationFn: (id: string) => adminApi.deleteMakeCascade(id),
+    onSuccess: (result) => {
+      queryClient.invalidateQueries({ queryKey: ['catalogMakes'] });
+      queryClient.invalidateQueries({ queryKey: ['catalogSearch'] });
+      toast.success(`Deleted ${result.totalDeleted} items (cascade)`);
+    },
+    onError: (error: ApiError) => {
+      toast.error(error.response?.data?.error || 'Failed to cascade delete make');
+    },
+  });
+};
+
+export const useDeleteModelCascade = () => {
+  const queryClient = useQueryClient();
+
+  return useMutation({
+    mutationFn: (id: string) => adminApi.deleteModelCascade(id),
+    onSuccess: (result) => {
+      queryClient.invalidateQueries({ queryKey: ['catalogModels'] });
+      queryClient.invalidateQueries({ queryKey: ['catalogSearch'] });
+      toast.success(`Deleted ${result.totalDeleted} items (cascade)`);
+    },
+    onError: (error: ApiError) => {
+      toast.error(error.response?.data?.error || 'Failed to cascade delete model');
+    },
+  });
+};
+
+export const useDeleteYearCascade = () => {
+  const queryClient = useQueryClient();
+
+  return useMutation({
+    mutationFn: (id: string) => adminApi.deleteYearCascade(id),
+    onSuccess: (result) => {
+      queryClient.invalidateQueries({ queryKey: ['catalogYears'] });
+      queryClient.invalidateQueries({ queryKey: ['catalogSearch'] });
+      toast.success(`Deleted ${result.totalDeleted} items (cascade)`);
+    },
+    onError: (error: ApiError) => {
+      toast.error(error.response?.data?.error || 'Failed to cascade delete year');
+    },
+  });
+};
+
+export const useDeleteTrimCascade = () => {
+  const queryClient = useQueryClient();
+
+  return useMutation({
+    mutationFn: (id: string) => adminApi.deleteTrimCascade(id),
+    onSuccess: (result) => {
+      queryClient.invalidateQueries({ queryKey: ['catalogTrims'] });
+      queryClient.invalidateQueries({ queryKey: ['catalogSearch'] });
+      toast.success(`Deleted ${result.totalDeleted} items (cascade)`);
+    },
+    onError: (error: ApiError) => {
+      toast.error(error.response?.data?.error || 'Failed to cascade delete trim');
+    },
+  });
+};
File diff suppressed because it is too large. Load Diff
@@ -158,3 +158,65 @@ export interface AdminAccessResponse {
   isAdmin: boolean;
   adminRecord: AdminUser | null;
 }
+
+// Catalog search types
+export interface CatalogSearchResult {
+  id: number;
+  year: number;
+  make: string;
+  model: string;
+  trim: string;
+  engineId: number | null;
+  engineName: string | null;
+  transmissionId: number | null;
+  transmissionType: string | null;
+}
+
+export interface CatalogSearchResponse {
+  items: CatalogSearchResult[];
+  total: number;
+  page: number;
+  pageSize: number;
+}
+
+// Catalog import types
+export interface ImportRow {
+  action: 'add' | 'update' | 'delete';
+  year: number;
+  make: string;
+  model: string;
+  trim: string;
+  engineName: string | null;
+  transmissionType: string | null;
+}
+
+export interface ImportError {
+  row: number;
+  error: string;
+}
+
+export interface ImportPreviewResult {
+  previewId: string;
+  toCreate: ImportRow[];
+  toUpdate: ImportRow[];
+  toDelete: ImportRow[];
+  errors: ImportError[];
+  valid: boolean;
+}
+
+export interface ImportApplyResult {
+  created: number;
+  updated: number;
+  deleted: number;
+  errors: ImportError[];
+}
+
+// Cascade delete result
+export interface CascadeDeleteResult {
+  deletedMakes: number;
+  deletedModels: number;
+  deletedYears: number;
+  deletedTrims: number;
+  deletedEngines: number;
+  totalDeleted: number;
+}
@@ -1,10 +1,10 @@
 import React, { useState } from 'react';
 import { useAuth0 } from '@auth0/auth0-react';
-import { useNavigate } from 'react-router-dom';
 import { GlassCard } from '../../../shared-minimal/components/mobile/GlassCard';
 import { MobileContainer } from '../../../shared-minimal/components/mobile/MobileContainer';
 import { useSettings } from '../hooks/useSettings';
 import { useAdminAccess } from '../../../core/auth/useAdminAccess';
+import { useNavigationStore } from '../../../core/store';

 interface ToggleSwitchProps {
   enabled: boolean;
@@ -71,7 +71,7 @@ const Modal: React.FC<ModalProps> = ({ isOpen, onClose, title, children }) => {

 export const MobileSettingsScreen: React.FC = () => {
   const { user, logout } = useAuth0();
-  const navigate = useNavigate();
+  const { navigateToScreen } = useNavigationStore();
   const { settings, updateSetting, isLoading, error } = useSettings();
   const { isAdmin, loading: adminLoading } = useAdminAccess();
   const [showDataExport, setShowDataExport] = useState(false);
@@ -258,7 +258,7 @@ export const MobileSettingsScreen: React.FC = () => {
|
|||||||
<h2 className="text-lg font-semibold text-blue-600 mb-4">Admin Console</h2>
|
<h2 className="text-lg font-semibold text-blue-600 mb-4">Admin Console</h2>
|
||||||
<div className="space-y-3">
|
<div className="space-y-3">
|
||||||
<button
|
<button
|
||||||
onClick={() => navigate('/garage/settings/admin/users')}
|
onClick={() => navigateToScreen('AdminUsers')}
|
||||||
className="w-full text-left p-4 bg-blue-50 text-blue-700 rounded-lg font-medium hover:bg-blue-100 transition-colors active:bg-blue-200"
|
className="w-full text-left p-4 bg-blue-50 text-blue-700 rounded-lg font-medium hover:bg-blue-100 transition-colors active:bg-blue-200"
|
||||||
style={{ minHeight: '44px' }}
|
style={{ minHeight: '44px' }}
|
||||||
>
|
>
|
||||||
@@ -266,7 +266,7 @@ export const MobileSettingsScreen: React.FC = () => {
|
|||||||
<div className="text-sm text-blue-600 mt-1">Manage admin users and permissions</div>
|
<div className="text-sm text-blue-600 mt-1">Manage admin users and permissions</div>
|
||||||
</button>
|
</button>
|
||||||
<button
|
<button
|
||||||
onClick={() => navigate('/garage/settings/admin/catalog')}
|
onClick={() => navigateToScreen('AdminCatalog')}
|
||||||
className="w-full text-left p-4 bg-blue-50 text-blue-700 rounded-lg font-medium hover:bg-blue-100 transition-colors active:bg-blue-200"
|
className="w-full text-left p-4 bg-blue-50 text-blue-700 rounded-lg font-medium hover:bg-blue-100 transition-colors active:bg-blue-200"
|
||||||
style={{ minHeight: '44px' }}
|
style={{ minHeight: '44px' }}
|
||||||
>
|
>
|
||||||
@@ -274,7 +274,7 @@ export const MobileSettingsScreen: React.FC = () => {
|
|||||||
<div className="text-sm text-blue-600 mt-1">Manage makes, models, and engines</div>
|
<div className="text-sm text-blue-600 mt-1">Manage makes, models, and engines</div>
|
||||||
</button>
|
</button>
|
||||||
<button
|
<button
|
||||||
onClick={() => navigate('/garage/settings/admin/stations')}
|
onClick={() => navigateToScreen('AdminStations')}
|
||||||
className="w-full text-left p-4 bg-blue-50 text-blue-700 rounded-lg font-medium hover:bg-blue-100 transition-colors active:bg-blue-200"
|
className="w-full text-left p-4 bg-blue-50 text-blue-700 rounded-lg font-medium hover:bg-blue-100 transition-colors active:bg-blue-200"
|
||||||
style={{ minHeight: '44px' }}
|
style={{ minHeight: '44px' }}
|
||||||
>
|
>
|
||||||
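The hunks above replace URL-based navigation (`navigate('/garage/settings/admin/…')`) with a screen-name action from a navigation store. A minimal sketch of the state shape that `navigateToScreen` implies is below; the real `useNavigationStore` in `core/store` may differ, and `createNavigationStore` plus the `Screen` union are assumptions (only the screen names `'AdminUsers'`, `'AdminCatalog'`, `'AdminStations'` come from the diff).

```typescript
type AdminScreen = 'AdminUsers' | 'AdminCatalog' | 'AdminStations';
type Screen = 'Settings' | AdminScreen;

interface NavigationState {
  currentScreen: Screen;
  navigateToScreen: (screen: Screen) => void;
}

// Plain-object stand-in for the store hook; the screen-name union means
// a typo like navigateToScreen('AdminUser') fails at compile time,
// unlike a mistyped URL string.
function createNavigationStore(initial: Screen = 'Settings'): NavigationState {
  const state: NavigationState = {
    currentScreen: initial,
    navigateToScreen(screen) {
      state.currentScreen = screen;
    },
  };
  return state;
}

const store = createNavigationStore();
store.navigateToScreen('AdminStations');
// store.currentScreen === 'AdminStations'
```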
File diff suppressed because it is too large