
test: enable encryption on nextjs-turbopack workbench #1262

Open
TooTallNate wants to merge 7 commits into main from
nate/enable-encryption-nextjs-turbopack

Conversation


@TooTallNate TooTallNate commented Mar 4, 2026

Summary

Enables e2e encryption on the nextjs-turbopack workbench project by setting VERCEL_DEPLOYMENT_KEY in vercel.json.
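For reference, the resulting env block in the workbench's vercel.json looks roughly like this (the placeholder stands in for the actual base64-encoded key in the diff, and the surrounding file structure is an assumption):

```json
{
  "env": {
    "WORKFLOW_PUBLIC_MANIFEST": "1",
    "VERCEL_DEPLOYMENT_KEY": "<base64-encoded-deployment-key>"
  }
}
```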

Purpose

This triggers the e2e test suite to run with encryption enabled, which validates:

  • New workflow runs produce encrypted event data
  • Encrypted data is correctly decrypted during replay
  • Pre-encryption runs (from before this deployment) continue to work (backwards compatibility)
  • CLI --decrypt flag works against real encrypted data
  • Web UI Decrypt button works against real encrypted data

What happens

Once this PR's preview deployment is live:

  1. The e2e tests will automatically run against it (Vercel Production env)
  2. New workflow runs triggered by the tests will have encrypted payloads
  3. The test suite should pass — any failures indicate encryption regressions

Cleanup

This key should be removed after the encryption bugbash is complete.

Set VERCEL_DEPLOYMENT_KEY to enable e2e encryption for the
nextjs-turbopack workbench project. This allows the e2e test
suite to exercise encrypted workflow runs alongside existing
unencrypted runs for backwards compatibility testing.
@TooTallNate TooTallNate requested a review from a team as a code owner March 4, 2026 21:47
Copilot AI review requested due to automatic review settings March 4, 2026 21:47

vercel bot commented Mar 4, 2026


changeset-bot bot commented Mar 4, 2026

🦋 Changeset detected

Latest commit: 2353fbc

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 15 packages
Name Type
@workflow/world-vercel Patch
@workflow/cli Patch
@workflow/core Patch
workflow Patch
@workflow/world-testing Patch
@workflow/builders Patch
@workflow/next Patch
@workflow/nitro Patch
@workflow/web-shared Patch
@workflow/astro Patch
@workflow/nest Patch
@workflow/rollup Patch
@workflow/sveltekit Patch
@workflow/vite Patch
@workflow/nuxt Patch



github-actions bot commented Mar 4, 2026

🧪 E2E Test Results

Some tests failed

Summary

Passed Failed Skipped Total
✅ ▲ Vercel Production 538 0 67 605
✅ 💻 Local Development 576 0 84 660
✅ 📦 Local Production 576 0 84 660
✅ 🐘 Local Postgres 576 0 84 660
✅ 🪟 Windows 52 0 3 55
❌ 🌍 Community Worlds 116 49 15 180
✅ 📋 Other 138 0 27 165
Total 2572 49 364 2985

❌ Failed Tests

🌍 Community Worlds (49 failed)

mongodb (1 failed):

  • webhookWorkflow

turso (48 failed):

  • addTenWorkflow
  • addTenWorkflow
  • wellKnownAgentWorkflow (.well-known/agent)
  • should work with react rendering in step
  • promiseAllWorkflow
  • promiseRaceWorkflow
  • promiseAnyWorkflow
  • importedStepOnlyWorkflow
  • hookWorkflow
  • webhookWorkflow
  • sleepingWorkflow
  • parallelSleepWorkflow
  • nullByteWorkflow
  • workflowAndStepMetadataWorkflow
  • fetchWorkflow
  • promiseRaceStressTestWorkflow
  • error handling error propagation workflow errors nested function calls preserve message and stack trace
  • error handling error propagation workflow errors cross-file imports preserve message and stack trace
  • error handling error propagation step errors basic step error preserves message and stack trace
  • error handling error propagation step errors cross-file step error preserves message and function names in stack
  • error handling retry behavior regular Error retries until success
  • error handling retry behavior FatalError fails immediately without retries
  • error handling retry behavior RetryableError respects custom retryAfter delay
  • error handling retry behavior maxRetries=0 disables retries
  • error handling retry behavior workflow completes despite transient 5xx on step_completed
  • error handling catchability FatalError can be caught and detected with FatalError.is()
  • hookCleanupTestWorkflow - hook token reuse after workflow completion
  • concurrent hook token conflict - two workflows cannot use the same hook token simultaneously
  • hookDisposeTestWorkflow - hook token reuse after explicit disposal while workflow still running
  • stepFunctionPassingWorkflow - step function references can be passed as arguments (without closure vars)
  • stepFunctionWithClosureWorkflow - step function with closure variables passed as argument
  • closureVariableWorkflow - nested step functions with closure variables
  • spawnWorkflowFromStepWorkflow - spawning a child workflow using start() inside a step
  • health check (queue-based) - workflow and step endpoints respond to health check messages
  • pathsAliasWorkflow - TypeScript path aliases resolve correctly
  • Calculator.calculate - static workflow method using static step methods from another class
  • AllInOneService.processNumber - static workflow method using sibling static step methods
  • ChainableService.processWithThis - static step methods using this to reference the class
  • thisSerializationWorkflow - step function invoked with .call() and .apply()
  • customSerializationWorkflow - custom class serialization with WORKFLOW_SERIALIZE/WORKFLOW_DESERIALIZE
  • instanceMethodStepWorkflow - instance methods with "use step" directive
  • crossContextSerdeWorkflow - classes defined in step code are deserializable in workflow context
  • stepFunctionAsStartArgWorkflow - step function reference passed as start() argument
  • cancelRun - cancelling a running workflow
  • cancelRun via CLI - cancelling a running workflow
  • pages router addTenWorkflow via pages router
  • pages router promiseAllWorkflow via pages router
  • pages router sleepingWorkflow via pages router

Details by Category

✅ ▲ Vercel Production
App Passed Failed Skipped
✅ astro 48 0 7
✅ example 48 0 7
✅ express 48 0 7
✅ fastify 48 0 7
✅ hono 48 0 7
✅ nextjs-turbopack 53 0 2
✅ nextjs-webpack 53 0 2
✅ nitro 48 0 7
✅ nuxt 48 0 7
✅ sveltekit 48 0 7
✅ vite 48 0 7
✅ 💻 Local Development
App Passed Failed Skipped
✅ astro-stable 46 0 9
✅ express-stable 46 0 9
✅ fastify-stable 46 0 9
✅ hono-stable 46 0 9
✅ nextjs-turbopack-canary 52 0 3
✅ nextjs-turbopack-stable 52 0 3
✅ nextjs-webpack-canary 52 0 3
✅ nextjs-webpack-stable 52 0 3
✅ nitro-stable 46 0 9
✅ nuxt-stable 46 0 9
✅ sveltekit-stable 46 0 9
✅ vite-stable 46 0 9
✅ 📦 Local Production
App Passed Failed Skipped
✅ astro-stable 46 0 9
✅ express-stable 46 0 9
✅ fastify-stable 46 0 9
✅ hono-stable 46 0 9
✅ nextjs-turbopack-canary 52 0 3
✅ nextjs-turbopack-stable 52 0 3
✅ nextjs-webpack-canary 52 0 3
✅ nextjs-webpack-stable 52 0 3
✅ nitro-stable 46 0 9
✅ nuxt-stable 46 0 9
✅ sveltekit-stable 46 0 9
✅ vite-stable 46 0 9
✅ 🐘 Local Postgres
App Passed Failed Skipped
✅ astro-stable 46 0 9
✅ express-stable 46 0 9
✅ fastify-stable 46 0 9
✅ hono-stable 46 0 9
✅ nextjs-turbopack-canary 52 0 3
✅ nextjs-turbopack-stable 52 0 3
✅ nextjs-webpack-canary 52 0 3
✅ nextjs-webpack-stable 52 0 3
✅ nitro-stable 46 0 9
✅ nuxt-stable 46 0 9
✅ sveltekit-stable 46 0 9
✅ vite-stable 46 0 9
✅ 🪟 Windows
App Passed Failed Skipped
✅ nextjs-turbopack 52 0 3
❌ 🌍 Community Worlds
App Passed Failed Skipped
✅ mongodb-dev 3 0 2
❌ mongodb 51 1 3
✅ redis-dev 3 0 2
✅ redis 52 0 3
✅ turso-dev 3 0 2
❌ turso 4 48 3
✅ 📋 Other
App Passed Failed Skipped
✅ e2e-local-dev-nest-stable 46 0 9
✅ e2e-local-postgres-nest-stable 46 0 9
✅ e2e-local-prod-nest-stable 46 0 9

📋 View full workflow run


github-actions bot commented Mar 4, 2026

📊 Benchmark Results

📈 Comparing against baseline from main branch. Green 🟢 = faster, Red 🔺 = slower.

workflow with no steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
💻 Local 🥇 Nitro 0.032s (-16.8% 🟢) 1.005s (~) 0.973s 10 1.00x
💻 Local Express 0.035s (+8.6% 🔺) 1.005s (~) 0.970s 10 1.10x
💻 Local Next.js (Turbopack) 0.041s 1.005s 0.964s 10 1.26x
🌐 Redis Next.js (Turbopack) 0.043s 1.005s 0.962s 10 1.34x
🐘 Postgres Next.js (Turbopack) 0.049s 1.011s 0.962s 10 1.52x
🐘 Postgres Nitro 0.052s (~) 1.010s (~) 0.958s 10 1.61x
🐘 Postgres Express 0.056s (+11.1% 🔺) 1.012s (~) 0.956s 10 1.74x
🌐 MongoDB Next.js (Turbopack) 0.112s 1.008s 0.896s 10 3.46x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 0.412s (-9.2% 🟢) 2.058s (-5.2% 🟢) 1.646s 10 1.00x
▲ Vercel Express 0.474s (+23.7% 🔺) 1.985s (+16.1% 🔺) 1.511s 10 1.15x
▲ Vercel Next.js (Turbopack) 0.588s (+19.4% 🔺) 2.088s (-2.6%) 1.500s 10 1.43x

🔍 Observability: Nitro | Express | Next.js (Turbopack)

workflow with 1 step

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
💻 Local 🥇 Nitro 1.102s (~) 2.005s (~) 0.903s 10 1.00x
💻 Local Next.js (Turbopack) 1.103s 2.005s 0.902s 10 1.00x
🌐 Redis Next.js (Turbopack) 1.104s 2.007s 0.903s 10 1.00x
💻 Local Express 1.111s (+0.8%) 2.006s (~) 0.895s 10 1.01x
🐘 Postgres Nitro 1.125s (+0.5%) 2.018s (~) 0.894s 10 1.02x
🐘 Postgres Next.js (Turbopack) 1.132s 2.013s 0.881s 10 1.03x
🐘 Postgres Express 1.134s (+3.8%) 2.013s (~) 0.879s 10 1.03x
🌐 MongoDB Next.js (Turbopack) 1.300s 2.008s 0.708s 10 1.18x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 1.970s (-1.9%) 3.132s (-13.7% 🟢) 1.162s 10 1.00x
▲ Vercel Express 2.031s (+4.1%) 3.476s (+17.0% 🔺) 1.445s 10 1.03x
▲ Vercel Next.js (Turbopack) 2.055s (-1.9%) 3.352s (~) 1.298s 10 1.04x

🔍 Observability: Nitro | Express | Next.js (Turbopack)

workflow with 10 sequential steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
💻 Local 🥇 Next.js (Turbopack) 10.644s 11.021s 0.377s 3 1.00x
🌐 Redis Next.js (Turbopack) 10.672s 11.023s 0.351s 3 1.00x
💻 Local Nitro 10.758s (~) 11.022s (~) 0.264s 3 1.01x
🐘 Postgres Next.js (Turbopack) 10.785s 11.043s 0.258s 3 1.01x
💻 Local Express 10.820s (~) 11.023s (~) 0.203s 3 1.02x
🐘 Postgres Nitro 10.823s (~) 11.039s (~) 0.215s 3 1.02x
🐘 Postgres Express 10.855s (+2.4%) 11.041s (~) 0.186s 3 1.02x
🌐 MongoDB Next.js (Turbopack) 12.213s 13.019s 0.806s 3 1.15x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 17.255s (-2.9%) 18.206s (-3.3%) 0.951s 2 1.00x
▲ Vercel Express 17.767s (+4.9%) 19.655s (+10.9% 🔺) 1.888s 2 1.03x
▲ Vercel Next.js (Turbopack) 324.292s (+1810.9% 🔺) 325.932s (+1654.9% 🔺) 1.640s 1 18.79x

🔍 Observability: Nitro | Express | Next.js (Turbopack)

workflow with 25 sequential steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🌐 Redis 🥇 Next.js (Turbopack) 26.568s 27.050s 0.481s 3 1.00x
🐘 Postgres Next.js (Turbopack) 26.807s 27.062s 0.256s 3 1.01x
🐘 Postgres Nitro 26.860s (-0.7%) 27.060s (-2.4%) 0.200s 3 1.01x
💻 Local Next.js (Turbopack) 26.919s 27.050s 0.131s 3 1.01x
🐘 Postgres Express 27.034s (+2.5%) 27.393s (+1.2%) 0.359s 3 1.02x
💻 Local Nitro 27.169s (~) 28.052s (~) 0.883s 3 1.02x
💻 Local Express 27.286s (~) 28.051s (~) 0.765s 3 1.03x
🌐 MongoDB Next.js (Turbopack) 30.328s 31.029s 0.701s 2 1.14x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Express 44.182s (+2.1%) 45.375s (+2.2%) 1.193s 2 1.00x
▲ Vercel Next.js (Turbopack) 44.305s (-1.2%) 45.251s (-2.6%) 0.946s 2 1.00x
▲ Vercel Nitro 45.406s (~) 46.879s (~) 1.473s 2 1.03x

🔍 Observability: Express | Next.js (Turbopack) | Nitro

workflow with 50 sequential steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🌐 Redis 🥇 Next.js (Turbopack) 53.133s 53.597s 0.465s 2 1.00x
🐘 Postgres Next.js (Turbopack) 53.538s 54.097s 0.559s 2 1.01x
🐘 Postgres Nitro 53.825s (~) 54.097s (~) 0.271s 2 1.01x
🐘 Postgres Express 53.921s (+2.3%) 54.105s (+1.9%) 0.184s 2 1.01x
💻 Local Next.js (Turbopack) 55.339s 56.095s 0.757s 2 1.04x
💻 Local Nitro 56.027s (~) 56.100s (~) 0.073s 2 1.05x
💻 Local Express 56.422s (~) 57.103s (~) 0.681s 2 1.06x
🌐 MongoDB Next.js (Turbopack) 60.698s 61.069s 0.371s 2 1.14x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 92.540s (-3.1%) 94.201s (-3.0%) 1.661s 1 1.00x
▲ Vercel Next.js (Turbopack) 95.564s (+3.7%) 97.175s (+4.1%) 1.611s 1 1.03x
▲ Vercel Express 96.347s (-1.6%) 98.291s (-0.9%) 1.944s 1 1.04x

🔍 Observability: Nitro | Next.js (Turbopack) | Express

Promise.all with 10 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🌐 Redis 🥇 Next.js (Turbopack) 1.245s 2.007s 0.762s 15 1.00x
🐘 Postgres Nitro 1.351s (~) 2.010s (~) 0.659s 15 1.08x
🐘 Postgres Express 1.367s (+6.3% 🔺) 2.011s (~) 0.644s 15 1.10x
🐘 Postgres Next.js (Turbopack) 1.380s 2.012s 0.632s 15 1.11x
💻 Local Nitro 1.403s (-1.3%) 2.005s (~) 0.601s 15 1.13x
💻 Local Next.js (Turbopack) 1.432s 2.005s 0.573s 15 1.15x
💻 Local Express 1.434s (~) 2.005s (~) 0.571s 15 1.15x
🌐 MongoDB Next.js (Turbopack) 2.148s 3.008s 0.860s 10 1.72x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 2.072s (-16.5% 🟢) 3.179s (-16.4% 🟢) 1.107s 10 1.00x
▲ Vercel Express 2.535s (+16.4% 🔺) 3.782s (+20.3% 🔺) 1.247s 8 1.22x
▲ Vercel Next.js (Turbopack) 2.573s (+9.0% 🔺) 3.555s (~) 0.982s 10 1.24x

🔍 Observability: Nitro | Express | Next.js (Turbopack)

Promise.all with 25 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 2.025s (+12.8% 🔺) 2.597s (+20.6% 🔺) 0.572s 12 1.00x
🐘 Postgres Next.js (Turbopack) 2.093s 2.597s 0.504s 12 1.03x
🐘 Postgres Nitro 2.162s (+3.1%) 2.746s (+5.6% 🔺) 0.584s 11 1.07x
💻 Local Next.js (Turbopack) 2.496s 3.007s 0.511s 10 1.23x
🌐 Redis Next.js (Turbopack) 2.548s 3.008s 0.460s 10 1.26x
💻 Local Express 2.597s (-2.1%) 3.008s (~) 0.411s 10 1.28x
💻 Local Nitro 2.612s (~) 3.007s (~) 0.395s 10 1.29x
🌐 MongoDB Next.js (Turbopack) 4.599s 5.175s 0.577s 6 2.27x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 2.494s (-16.0% 🟢) 3.619s (-15.9% 🟢) 1.124s 9 1.00x
▲ Vercel Express 2.682s (+8.9% 🔺) 4.125s (+26.0% 🔺) 1.443s 8 1.08x
▲ Vercel Next.js (Turbopack) 2.704s (+5.5% 🔺) 3.570s (-0.6%) 0.865s 9 1.08x

🔍 Observability: Nitro | Express | Next.js (Turbopack)

Promise.all with 50 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Next.js (Turbopack) 3.694s 4.450s 0.756s 7 1.00x
🐘 Postgres Express 3.803s (+28.1% 🔺) 4.451s (+10.8% 🔺) 0.648s 7 1.03x
🐘 Postgres Nitro 3.983s (+1.8%) 4.740s (+3.0%) 0.757s 7 1.08x
🌐 Redis Next.js (Turbopack) 4.353s 5.012s 0.659s 6 1.18x
💻 Local Next.js (Turbopack) 6.667s 7.516s 0.849s 4 1.80x
💻 Local Express 7.522s (-1.5%) 8.019s (~) 0.497s 4 2.04x
💻 Local Nitro 7.556s (-1.1%) 8.021s (~) 0.465s 4 2.05x
🌐 MongoDB Next.js (Turbopack) 9.985s 10.347s 0.362s 3 2.70x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Express 2.815s (-5.1% 🟢) 3.916s (-4.4%) 1.101s 8 1.00x
▲ Vercel Nitro 3.172s (+3.2%) 4.210s (-3.1%) 1.038s 8 1.13x
▲ Vercel Next.js (Turbopack) 3.816s (+27.0% 🔺) 4.873s (+12.3% 🔺) 1.057s 7 1.36x

🔍 Observability: Express | Nitro | Next.js (Turbopack)

Promise.race with 10 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🌐 Redis 🥇 Next.js (Turbopack) 1.235s 2.007s 0.772s 15 1.00x
🐘 Postgres Nitro 1.374s (-1.0%) 2.012s (~) 0.638s 15 1.11x
🐘 Postgres Next.js (Turbopack) 1.377s 2.011s 0.635s 15 1.12x
🐘 Postgres Express 1.379s (+6.5% 🔺) 2.011s (~) 0.632s 15 1.12x
💻 Local Next.js (Turbopack) 1.392s 2.004s 0.613s 15 1.13x
💻 Local Nitro 1.428s (~) 2.006s (~) 0.578s 15 1.16x
💻 Local Express 1.454s (+1.3%) 2.005s (~) 0.552s 15 1.18x
🌐 MongoDB Next.js (Turbopack) 2.185s 3.007s 0.822s 10 1.77x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Next.js (Turbopack) 2.067s (-10.2% 🟢) 3.119s (-15.5% 🟢) 1.052s 10 1.00x
▲ Vercel Express 2.080s (-1.1%) 3.585s (+9.3% 🔺) 1.505s 9 1.01x
▲ Vercel Nitro 2.237s (+4.5%) 3.568s (+4.6%) 1.331s 9 1.08x

🔍 Observability: Next.js (Turbopack) | Express | Nitro

Promise.race with 25 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Nitro 1.976s (-4.3%) 2.515s (-8.2% 🟢) 0.539s 12 1.00x
🐘 Postgres Express 1.990s (+16.5% 🔺) 2.598s (+20.6% 🔺) 0.609s 12 1.01x
🐘 Postgres Next.js (Turbopack) 2.015s 2.513s 0.498s 12 1.02x
🌐 Redis Next.js (Turbopack) 2.533s 3.008s 0.476s 10 1.28x
💻 Local Next.js (Turbopack) 2.575s 3.009s 0.434s 10 1.30x
💻 Local Express 2.715s (-2.0%) 3.008s (~) 0.293s 10 1.37x
💻 Local Nitro 2.760s (+1.6%) 3.008s (~) 0.248s 10 1.40x
🌐 MongoDB Next.js (Turbopack) 4.793s 5.177s 0.384s 6 2.42x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Express 2.386s (-13.9% 🟢) 3.554s (-3.8%) 1.168s 9 1.00x
▲ Vercel Next.js (Turbopack) 2.509s (-6.9% 🟢) 3.388s (-6.3% 🟢) 0.879s 9 1.05x
▲ Vercel Nitro 3.112s (+28.8% 🔺) 4.216s (+15.7% 🔺) 1.104s 8 1.30x

🔍 Observability: Express | Next.js (Turbopack) | Nitro

Promise.race with 50 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 3.589s (+29.6% 🔺) 4.274s (+19.7% 🔺) 0.684s 8 1.00x
🐘 Postgres Next.js (Turbopack) 3.779s 4.593s 0.814s 7 1.05x
🐘 Postgres Nitro 3.927s (+13.9% 🔺) 4.460s (+3.5%) 0.533s 7 1.09x
🌐 Redis Next.js (Turbopack) 4.205s 5.012s 0.807s 6 1.17x
💻 Local Next.js (Turbopack) 7.981s 8.516s 0.535s 4 2.22x
💻 Local Express 8.094s (-3.9%) 9.022s (~) 0.928s 4 2.25x
💻 Local Nitro 8.152s (-1.0%) 8.773s (-2.8%) 0.620s 4 2.27x
🌐 MongoDB Next.js (Turbopack) 10.049s 10.350s 0.300s 3 2.80x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Next.js (Turbopack) 2.742s (-21.3% 🟢) 4.323s (-13.6% 🟢) 1.581s 7 1.00x
▲ Vercel Express 3.019s (+5.3% 🔺) 4.603s (+14.8% 🔺) 1.583s 7 1.10x
▲ Vercel Nitro 3.096s (+10.5% 🔺) 4.491s (+3.1%) 1.395s 7 1.13x

🔍 Observability: Next.js (Turbopack) | Express | Nitro

Stream Benchmarks (includes TTFB metrics)
workflow with stream

💻 Local Development

World Framework Workflow Time TTFB Slurp Wall Time Overhead Samples vs Fastest
💻 Local 🥇 Next.js (Turbopack) 0.137s 1.002s 0.011s 1.017s 0.880s 10 1.00x
🌐 Redis Next.js (Turbopack) 0.152s 1.000s 0.002s 1.008s 0.856s 10 1.11x
💻 Local Nitro 0.168s (-1.5%) 1.003s (~) 0.011s (-0.9%) 1.017s (~) 0.849s 10 1.23x
🐘 Postgres Next.js (Turbopack) 0.177s 1.001s 0.002s 1.014s 0.837s 10 1.29x
💻 Local Express 0.178s (+3.7%) 1.003s (~) 0.011s (-1.7%) 1.018s (~) 0.840s 10 1.30x
🐘 Postgres Nitro 0.184s (~) 0.992s (~) 0.001s (-14.3% 🟢) 1.012s (~) 0.828s 10 1.34x
🐘 Postgres Express 0.192s (+41.9% 🔺) 0.994s (-0.6%) 0.001s (~) 1.015s (~) 0.823s 10 1.40x
🌐 MongoDB Next.js (Turbopack) 0.489s 0.953s 0.002s 1.009s 0.520s 10 3.57x

▲ Production (Vercel)

World Framework Workflow Time TTFB Slurp Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 1.442s (-6.1% 🟢) 2.389s (+4.3%) 0.150s (-6.4% 🟢) 2.968s (+2.4%) 1.526s 10 1.00x
▲ Vercel Express 1.682s (+12.8% 🔺) 2.678s (+9.5% 🔺) 0.119s (-20.2% 🟢) 3.274s (+11.5% 🔺) 1.592s 10 1.17x
▲ Vercel Next.js (Turbopack) 1.865s (+16.5% 🔺) 2.656s (+7.4% 🔺) 0.133s (-72.4% 🟢) 3.213s (-5.9% 🟢) 1.348s 10 1.29x

🔍 Observability: Nitro | Express | Next.js (Turbopack)

Summary

Fastest Framework by World

Winner determined by most benchmark wins

World 🥇 Fastest Framework Wins
💻 Local Next.js (Turbopack) 9/12
🐘 Postgres Next.js (Turbopack) 6/12
▲ Vercel Nitro 7/12
Fastest World by Framework

Winner determined by most benchmark wins

Framework 🥇 Fastest World Wins
Express 🐘 Postgres 6/12
Next.js (Turbopack) 💻 Local 4/12
Nitro 🐘 Postgres 6/12
Column Definitions
  • Workflow Time: Runtime reported by workflow (completedAt - createdAt) - primary metric
  • TTFB: Time to First Byte - time from workflow start until first stream byte received (stream benchmarks only)
  • Slurp: Time from first byte to complete stream consumption (stream benchmarks only)
  • Wall Time: Total testbench time (trigger workflow + poll for result)
  • Overhead: Testbench overhead (Wall Time - Workflow Time)
  • Samples: Number of benchmark iterations run
  • vs Fastest: How much slower compared to the fastest configuration for this benchmark

Worlds:

  • 💻 Local: In-memory filesystem world (local development)
  • 🐘 Postgres: PostgreSQL database world (local development)
  • ▲ Vercel: Vercel production/preview deployment
  • 🌐 Turso: Community world (local development)
  • 🌐 MongoDB: Community world (local development)
  • 🌐 Redis: Community world (local development)
  • 🌐 Jazz: Community world (local development)

📋 View full workflow run


Copilot AI left a comment


Pull request overview

Enables e2e encryption for the nextjs-turbopack workbench deployment by configuring VERCEL_DEPLOYMENT_KEY in the Vercel project config so the test suite exercises encrypted payloads and decryption paths.

Changes:

  • Adds VERCEL_DEPLOYMENT_KEY to the workbench’s vercel.json environment configuration.


  "env": {
-   "WORKFLOW_PUBLIC_MANIFEST": "1"
+   "WORKFLOW_PUBLIC_MANIFEST": "1",
+   "VERCEL_DEPLOYMENT_KEY": "6r4/aqeYUXPsPyLozM4OjSiS7N88aeuWYLFMAKiDE94="

Copilot AI Mar 4, 2026

The reason will be displayed to describe this comment to others. Learn more.

VERCEL_DEPLOYMENT_KEY looks like a sensitive deployment secret (used to derive encryption keys) but is being committed as a literal value in vercel.json. This exposes the key in git history and to anyone with repo access, and it will apply to every future deployment of this workbench. Please remove the hard-coded value and source it from Vercel-managed environment variables/secrets instead (e.g., configure it in the Vercel project/preview env, or reference a Vercel secret rather than inlining). Also rotate the key since it’s now been published in the PR diff.

Suggested change:
- "VERCEL_DEPLOYMENT_KEY": "6r4/aqeYUXPsPyLozM4OjSiS7N88aeuWYLFMAKiDE94="
+ "VERCEL_DEPLOYMENT_KEY": "@vercel-deployment-key"

Contributor


This is a false positive. The workbench/nextjs-turbopack project is a private test workbench used for e2e testing — it's not a production application. The key is intentionally committed as a test fixture so that CI deployments of this workbench exercise the encryption code paths. There's no security risk here since this key only protects test data in a test environment.

Member Author


It's temporary dude. Read the PR description.

The e2e runner calls fetchRunKey for every run, which can exceed the
API rate limit (30/min per team) when many tests run in parallel.
Add retry logic with exponential backoff + jitter (up to 5 retries)
and respect the Retry-After header when present.
…eout(r, NaN)` to resolve immediately (0ms), triggering rapid-fire retries against a rate-limited API endpoint.

This commit fixes the issue reported at packages/world-vercel/src/encryption.ts:138

**Bug Analysis:**

In `packages/world-vercel/src/encryption.ts`, the `fetchRunKey` function handles HTTP 429 (rate limited) responses by reading the `retry-after` header and computing a delay. The original code was:

```js
const retryAfter = response.headers.get('retry-after');
const delay = retryAfter
  ? Number.parseInt(retryAfter, 10) * 1000
  : BASE_DELAY_MS * 2 ** attempt + Math.random() * 500;
```

Per RFC 9110, the `Retry-After` header can contain either a number of seconds OR an HTTP date string (e.g., `"Wed, 21 Oct 2015 07:28:00 GMT"`). When the header contains a date string or any non-numeric value, `Number.parseInt(retryAfter, 10)` returns `NaN`. Since the header string is truthy (non-empty), the ternary takes the parseInt path rather than the exponential backoff fallback. `NaN * 1000` evaluates to `NaN`, and `setTimeout(r, NaN)` resolves immediately (~0ms, confirmed by testing). This causes rapid-fire retries (up to 5 iterations with zero delay) against a rate-limited API endpoint, wasting resources and potentially worsening the rate-limiting situation.
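The failure mode is easy to reproduce in isolation. A standalone sketch (the constant value is hypothetical; variable names mirror the snippet above):

```javascript
// Reproduce the bug: an HTTP-date Retry-After value poisons the delay with NaN.
const retryAfter = 'Wed, 21 Oct 2015 07:28:00 GMT'; // RFC 9110 HTTP-date form
const BASE_DELAY_MS = 500; // hypothetical base delay
const attempt = 0;

// The original ternary: the non-empty header string is truthy, so the
// parseInt branch is taken and NaN propagates instead of the backoff fallback.
const delay = retryAfter
  ? Number.parseInt(retryAfter, 10) * 1000
  : BASE_DELAY_MS * 2 ** attempt + Math.random() * 500;

console.log(Number.isNaN(delay)); // true; setTimeout(r, NaN) then fires after ~0ms
```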

**Fix:**

Parse the integer first, then check if the result is NaN before using it. If NaN, fall back to exponential backoff with jitter. This matches the established pattern already used in `packages/world-vercel/src/utils.ts` (lines 309-315) where the same header is properly guarded with `!Number.isNaN(parsed)`.

```js
const retryAfterHeader = response.headers.get('retry-after');
const parsedRetryAfter = retryAfterHeader
  ? Number.parseInt(retryAfterHeader, 10)
  : NaN;
const delay = !Number.isNaN(parsedRetryAfter)
  ? parsedRetryAfter * 1000
  : BASE_DELAY_MS * 2 ** attempt + Math.random() * 500;
```
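The fix above routes the HTTP-date form through the backoff fallback; RFC 9110 also permits handling that form explicitly by parsing the date. A self-contained sketch of that variant (the function name and constant are illustrative, not the actual source):

```javascript
const BASE_DELAY_MS = 500; // illustrative; the real constant lives in encryption.ts

// Compute a retry delay from a Retry-After header, supporting both
// RFC 9110 forms: delay-seconds (e.g. "120") and HTTP-date.
function computeRetryDelay(retryAfterHeader, attempt) {
  if (retryAfterHeader) {
    const seconds = Number.parseInt(retryAfterHeader, 10);
    if (!Number.isNaN(seconds)) {
      return seconds * 1000; // delay-seconds form
    }
    const dateMs = Date.parse(retryAfterHeader);
    if (!Number.isNaN(dateMs)) {
      return Math.max(0, dateMs - Date.now()); // HTTP-date form, clamped to >= 0
    }
  }
  // Missing or unparseable header: exponential backoff with jitter.
  return BASE_DELAY_MS * 2 ** attempt + Math.random() * 500;
}

console.log(computeRetryDelay('2', 0)); // 2000
```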

Co-authored-by: Vercel <vercel[bot]@users.noreply.github.com>
Co-authored-by: TooTallNate <n@n8.io>
With encryption enabled on the nextjs-turbopack workbench, CLI inspect
output shows encrypted data as placeholders. The e2e tests check for
actual output values, so they need --decrypt to see the real data.
The shared RetryAgent from getDispatcher() already handles 429/5xx
retries with exponential backoff and Retry-After header support.
The manual retry loop in fetchRunKey was redundant.

Also add changeset for the VERCEL=1 external context fix.
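For context, a changeset is a small markdown file under `.changeset/` whose YAML frontmatter names the affected packages and their bump types, roughly like this (the description wording here is illustrative):

```markdown
---
"@workflow/world-vercel": patch
---

Fix external context detection when VERCEL=1 is set.
```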
