Project Overview

We are building a Next.js + Node.js analytics SaaS for affiliates, deployed on Vercel. Users connect affiliate networks via API or CSV upload; we normalise performance stats and provide dashboards, GEO heatmaps, anonymised and aggregated network benchmarks, and alerts.
The core challenge (and priority) is building a clean, reliable data ingestion and aggregation system that scales as more users and networks connect.
What You’ll Work On

You will lead the backend/data engineering work for ingestion and analytics, including:
- Design the data model in Postgres for raw imports, normalised metrics, and aggregated reporting
- Build API integrations (auth where needed, scheduled pulls, pagination, rate limits)
- Build a CSV import pipeline (validation, mapping, deduplication, error handling, audit trail)
- Implement background jobs for ingestion and processing (cron plus queues/workers) with retries, backoff, and idempotency
- Build the aggregation layer for dashboards/heatmaps and anonymised benchmarks (privacy-safe rules for small sample sizes)
- Expose clean, performance-focused API endpoints for the Next.js app
- Add logging/monitoring and basic automated tests for pipeline reliability
- Ensure sensible handling of sensitive data (secrets, access tokens, and user isolation)
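To make the retry and idempotency expectations above concrete, here is a minimal TypeScript sketch of the pattern we have in mind. Everything in it (`RawRow`, `upsertIdempotent`, `withRetry`, the in-memory `Set`) is illustrative, not part of the brief; in production the `Set` would be a Postgres `UNIQUE (source, external_id)` constraint hit via `INSERT ... ON CONFLICT DO NOTHING`.

```typescript
type RawRow = { source: string; externalId: string; payload: unknown };

// Stand-in for a Postgres table with a UNIQUE(source, external_id) index.
const seen = new Set<string>();

/** Insert a row only if its (source, externalId) key is new; returns true if inserted. */
function upsertIdempotent(row: RawRow): boolean {
  const key = `${row.source}:${row.externalId}`;
  if (seen.has(key)) return false; // duplicate import: skip silently
  seen.add(key);
  return true;
}

/** Retry an async operation with exponential backoff (for flaky network pulls). */
async function withRetry<T>(fn: () => Promise<T>, attempts = 3, baseMs = 100): Promise<T> {
  let lastErr: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err;
      // Wait baseMs, 2*baseMs, 4*baseMs, ... between attempts.
      await new Promise((r) => setTimeout(r, baseMs * 2 ** i));
    }
  }
  throw lastErr;
}
```

Because re-running an import only skips rows that already landed, a failed batch can simply be retried end to end without creating duplicates.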
Current Stack / Preferences
- Next.js (App Router) + Node.js runtime
- Vercel deployment
- Postgres (hosted provider is flexible)
- ORM: Prisma or Drizzle (open to your recommendation)
- Background jobs: cron + queue/worker approach (open to your recommendation)
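For reference, the cron + queue/worker shape above can be sketched in-memory like this. A real deployment would use a durable queue (e.g. pg-boss or BullMQ) rather than an array, and all names here are illustrative assumptions:

```typescript
type Job = { id: string; attempts: number };

// In-memory stand-in for a durable job queue.
const queue: Job[] = [];

/** Cron tick: enqueue one scheduled pull per connected network. */
function cronTick(networks: string[]): void {
  for (const n of networks) queue.push({ id: n, attempts: 0 });
}

/** Worker: drain the queue, re-enqueueing failed jobs up to a retry cap. */
function drain(handler: (id: string) => boolean, maxAttempts = 3): string[] {
  const done: string[] = [];
  while (queue.length > 0) {
    const job = queue.shift()!;
    if (handler(job.id)) {
      done.push(job.id); // success
    } else if (job.attempts + 1 < maxAttempts) {
      queue.push({ ...job, attempts: job.attempts + 1 }); // retry later
    }
    // else: drop (in production: move to a dead-letter table and alert)
  }
  return done;
}
```

The point of the sketch is the separation: the cron side only schedules, and the worker side owns execution, retries, and failure handling.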
Required Experience (Must Have)
- Strong Node.js/TypeScript backend experience (production SaaS preferred)
- Deep Postgres skills (schema design, indexing, query optimisation, migrations)
- Real background job experience (queues/workers, cron scheduling, retries, idempotency)
- Proven experience integrating third-party APIs and handling messy/partial data
- Ability to design systems that are clean, organised, and maintainable
Nice to Have
- Experience deploying Node/Next.js systems on Vercel or serverless environments
- Experience building analytics/aggregation systems (materialized views, rollups, caching strategies)
- Familiarity with privacy-safe aggregation (minimum sample thresholds, anonymisation rules)
- Experience with affiliate platforms, iGaming, or performance marketing analytics
- Observability tooling (Sentry, OpenTelemetry, structured logging)
Engagement
- Contract role (remote)
- Start with an initial scope focused on the ingestion + aggregation MVP, with potential for ongoing work
- Please confirm you are comfortable with the milestone-based budget and timeline below
- Deliverables are defined by the milestone acceptance criteria below
What Success Looks Like (Deliverables)
- Clear backend architecture for ingestion, processing, and aggregation
- Working pipeline for CSV import and at least one API integration (with a pattern for adding more)
- Normalised metric layer (consistent definitions across sources)
- Aggregated tables/endpoints powering dashboards + GEO heatmap
- Foundation for anonymised benchmark calculations
- Clean code structure, basic tests, and logging
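As a sketch of what "privacy-safe rules for small sample sizes" could mean for the anonymised benchmarks, here is one illustrative approach: a minimum-sample threshold that withholds under-populated buckets entirely. The threshold value, row shape, and function names are assumptions for illustration, not spec:

```typescript
type MetricRow = { geo: string; network: string; clicks: number };

// Suppress any bucket aggregated from fewer contributing rows than this.
const MIN_SAMPLE = 5;

/** Aggregate clicks per GEO, dropping buckets below the minimum sample size. */
function benchmarkByGeo(rows: MetricRow[]): Map<string, number> {
  const sums = new Map<string, { total: number; n: number }>();
  for (const r of rows) {
    const bucket = sums.get(r.geo) ?? { total: 0, n: 0 };
    bucket.total += r.clicks;
    bucket.n += 1;
    sums.set(r.geo, bucket);
  }
  const out = new Map<string, number>();
  for (const [geo, bucket] of sums) {
    // Small buckets are withheld entirely so a single network's numbers
    // cannot be reverse-engineered from a GEO/brand slice.
    if (bucket.n >= MIN_SAMPLE) out.set(geo, bucket.total);
  }
  return out;
}
```

In Postgres the same rule is naturally a `HAVING count(*) >= :min_sample` clause on the rollup query, applied before any benchmark row is exposed.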
How to Apply

Please send:
- A short intro and 1–3 relevant projects you’ve shipped (links if possible)
- Your preferred stack for Postgres + jobs (Prisma/Drizzle, cron/queues, ETL approach)
- A brief outline of how you would design ingestion + deduplication + retries for API and CSV sources
Screening Questions (Answer briefly)
- Describe a pipeline you built. How did you handle retries, rate limits, and duplicate imports?
- What’s your preferred approach to background jobs in a Next.js/Vercel setup?
- How would you prevent anonymised benchmarks from leaking data in small GEO/brand sample sizes?
We are optimising for correctness and reliability over flashy UI. The data pipeline is the constraint. Please include one example of a data pipeline you shipped in production and what broke first.
---------------------------------
**See attached PDF for Milestones and detailed project overview**
Budget
- Timeline: preferably within 3 months (Milestones 1 to 5 delivered on a rolling basis)
- Payment: milestone-based, €1,200 per milestone (5 milestones)
- Total budget: €6,000
Milestone payments are released as milestones are completed and accepted, not strictly one per month. Some milestones may be delivered in the same month depending on progress.
---------------------------------
Preferred applicants: Senior backend/data engineers with proven production experience in Node.js/TypeScript, Postgres, and background job systems (data pipelines, ETL, ingestion, rollups).
More ongoing work available after this project for the right candidate.