Case Study: Deploying a Secure Quantum Job Submission Portal Using Local Browsers and Post-Quantum TLS
2026-02-14

A 2026 case study: how a university lab built a QPU submission portal using local browser AI and hybrid post-quantum TLS for secure, auditable job submission.

Why a secure, practical QPU submission portal matters in 2026

University labs building hybrid quantum-classical research stacks face the same friction: limited access to QPUs, rapidly changing SDKs, and a growing security surface as experiments move from local notebooks to web portals. This case study shows how a mid-sized university physics lab deployed a secure job submission portal that uses local-browser AI for client-side preprocessing and post-quantum TLS (PQTLS) for transport — a practical blueprint you can replicate or adapt in 2026.

Executive summary — what we built and why it matters

The lab’s objectives were straightforward: enable students and researchers to submit quantum jobs to shared QPUs, reduce sensitive data sent to cloud backends, and future-proof communications against quantum-capable adversaries. We implemented a web portal where the browser (running a local WASM-based model) performs pre-submission checks, job canonicalization, and partial noise mitigation suggestions. The server authenticates and authorizes users, accepts job bundles over a hybrid PQTLS 1.3 session (Kyber + X25519), and forwards signed job payloads to the QPU provider’s API. Key takeaways: lowered operational risk, preserved privacy, and an auditable, post-quantum-ready chain of custody for experiment data. Three developments in 2025–2026 made the timing right:

  • Post-quantum crypto is now operational in production labs: by late 2025 many TLS gateways had started supporting hybrid post-quantum key exchange (e.g., CRYSTALS-Kyber + X25519).
  • Local-AI in browsers matured: WebAssembly builds of popular LLMs and quantized runtimes (llama.cpp wasm, WebNN-backed runners) made client-side preprocessing feasible on desktops and modern laptops.
  • Quantum-access APIs stabilized: most QPU providers in 2025–2026 offered HTTP/REST job submission endpoints with JSON job manifests and artifact retention policies — suitable for being wrapped by a secure portal.

Threat model and security goals

We assumed adversaries with network capability (passive eavesdropping and active MITM) and future quantum-capable key-recovery as a long-term threat. We did not assume full kernel-level compromise of clients. Goals were:

  • Confidentiality: protect job payloads in transit with hybrid PQTLS.
  • Integrity: sign job manifests with a PQ signature (CRYSTALS-Dilithium) before submission to the QPU provider.
  • Privacy & minimization: run preprocessing locally in the browser to strip or anonymize sensitive classical data (e.g., user identifiers embedded in wavefunction metadata).
  • Auditability: maintain an append-only ledger of submissions and results (using classical immutability guarantees; a later phase considered blockchain anchoring). A minimal hash-chain sketch follows this list.
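
For the auditability goal, the first phase used a simple hash-chained table rather than a blockchain. The sketch below is illustrative only (the db handle, table name, and entry shape are assumptions, not the lab's schema): each record stores the hash of its predecessor, so rewriting history breaks the chain and is detectable during audits.

// audit-ledger.js (gateway, Node.js): illustrative sketch of a hash-chained append-only log
const crypto = require('crypto');

function entryHash(prevHash, entry) {
  // Bind each record to its predecessor so earlier entries cannot be rewritten silently
  return crypto.createHash('sha256')
    .update(prevHash + JSON.stringify(entry))
    .digest('hex');
}

async function appendAudit(db, entry) {
  const last = await db.audit.findLatest();      // assumed helper: newest ledger record or null
  const prevHash = last ? last.hash : 'GENESIS';
  const hash = entryHash(prevHash, entry);
  await db.audit.insert({ ...entry, prevHash, hash });
  return hash;                                   // the hash can be echoed back to the client as a receipt
}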

High-level architecture

The solution has three logical layers:

  1. Client (Browser): UI, local-AI preprocessor (WASM), PQ signature utilities (WASM liboqs), and WebSocket/HTTP client talking PQTLS to the server gateway.
  2. Server Gateway: TLS-terminating gateway with OpenSSL+liboqs or a managed PQTLS reverse proxy; enforces RBAC, stores job metadata, and signs/forwards canonicalized payloads to QPU provider API.
  3. Backend & QPU Provider: Job scheduler, compute cluster, and provider API (REST). The gateway communicates using provider-specific authentication; long-term provider API keys are stored in an HSM/KMS.

Diagram (textual)

Browser (local WASM LLM + PQ libs) --(hybrid PQTLS 1.3)--> PQTLS Gateway (OpenSSL+liboqs) --> Internal API & HSM --> QPU Provider API

Step-by-step deployment — what the lab actually did

1) Prototype the client-side preprocessor

The team started with a WebAssembly-compiled quantized model (llama.cpp → wasm) to run small token-based transforms in-browser. The preprocessing tasks were:

  • Strip metadata and PII from payloads.
  • Validate gate sets and transform high-level circuits into provider-compatible JSON (using a WASM-compiled transpiler).
  • Provide a human-friendly summary and local explainability for the job.

Example client flow (simplified):

// main.js (browser)
async function prepareAndSubmit(job) {
  // 1. Run local model to canonicalize
  const canonical = await localModel.canonicalize(job);

  // 2. Run lightweight validation
  const ok = await validator.validate(canonical);
  if (!ok) throw new Error('Validation failed');

  // 3. Sign with PQ signature (WASM liboqs)
  const signature = await pqSign(canonical);

  // 4. Send over PQTLS connection
  const resp = await fetch('/submit', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ canonical, signature })
  });
  if (!resp.ok) throw new Error(`Submission failed: HTTP ${resp.status}`);
  return resp.json();
}

Implementation notes

  • Use a WebWorker for the WASM model to keep the UI responsive (see the sketch after this list).
  • Run the quantized model at low precision to reduce memory (most student laptops could handle it after quantization).
  • Package WASM assets with accurate Subresource Integrity (SRI) hashes and serve over PQTLS to avoid tampering.
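
A minimal sketch of the WebWorker pattern from the first note above, assuming the WASM glue exposes a loadModel() entry point and the loaded model offers the canonicalize() call used earlier; file names, helper names, and message shapes are illustrative.

// model-worker.js: runs the WASM model off the main thread
importScripts('/wasm/local-model.js');   // assumed glue script that defines loadModel()
const modelPromise = loadModel();        // load the quantized model once per worker

self.onmessage = async (e) => {
  const model = await modelPromise;
  const canonical = await model.canonicalize(e.data.job);
  self.postMessage({ id: e.data.id, canonical });
};

// main.js (UI thread): promise wrapper so the rest of the submission flow stays unchanged
const modelWorker = new Worker('/js/model-worker.js');
function canonicalizeInWorker(job) {
  const id = crypto.randomUUID();
  return new Promise((resolve) => {
    const onMessage = (e) => {
      if (e.data.id !== id) return;
      modelWorker.removeEventListener('message', onMessage);
      resolve(e.data.canonical);
    };
    modelWorker.addEventListener('message', onMessage);
    modelWorker.postMessage({ id, job });
  });
}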

2) Add post-quantum primitives in the browser

Because mainstream WebCrypto did not yet expose PQ algorithms in 2026, the team used a lightweight WASM build of liboqs for KEM and Dilithium for signatures. The key lifecycle was intentionally short: the browser generated ephemeral keypairs for session-level signatures and used a long-lived user identity stored in the OS credential manager to authorize role membership.

// pq-crypto-wasm usage
const pq = await import('/wasm/pq-lib.js');
const { publicKey, privateKey } = pq.dilithium.generateKeypair();
const sig = pq.dilithium.sign(privateKey, canonicalPayload);

3) Deploy PQTLS at the gateway

The gateway was a small VM running a patched OpenSSL build with liboqs integration. The lab evaluated two deployment options:

  • Self-managed OpenSSL + liboqs (gives full control).
  • Managed PQTLS proxy from the institution’s reverse-proxy team (faster compliance, limited crypto control).

The lab chose self-managed for visibility. Key steps:

  1. Build liboqs and the OQS-OpenSSL integration (kept in a reproducible container).
  2. Configure TLS 1.3 with a hybrid KEM: Kyber768 + X25519 and a PQ signature for server certs (CRYSTALS-Dilithium in the cert chain for future verification audits).
  3. Restrict TLS ciphers to the hybrid sets and enable strict TLS 1.3 server-only mode.

Sample OpenSSL startup (containerized):

# Build and run container (simplified)
docker build -t pqtls-gateway:1.0 .
docker run -p 443:443 pqtls-gateway:1.0   # no extra Linux capabilities are needed for TLS termination

nginx config (TLS termination, nginx linked against the OQS-enabled OpenSSL build):

server {
    listen 443 ssl;
    ssl_certificate /etc/ssl/pq/server-chain.pem; # includes Dilithium leaf
    ssl_certificate_key /etc/ssl/pq/server.key;
    ssl_protocols TLSv1.3;
    ssl_ciphers "TLS_AES_128_GCM_SHA256"; # handshake uses hybrid KEM via OpenSSL configs
    location / { proxy_pass http://app:8080; }
}
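
The "hybrid KEM via OpenSSL configs" comment above refers to group selection in the OpenSSL configuration rather than in nginx's cipher string. One way to express this with OpenSSL 3 and the oqs-provider is sketched below (the mechanism differs on the older OQS-OpenSSL 1.1.1 fork); the hybrid group name, here x25519_kyber768, varies between liboqs/oqs-provider releases, so check the names exposed by your build before copying this.

# /etc/ssl/openssl.cnf (excerpt, sketch): activate the OQS provider and pin the hybrid group
openssl_conf = openssl_init

[openssl_init]
providers = provider_sect
ssl_conf  = ssl_sect

[provider_sect]
default     = default_sect
oqsprovider = oqsprovider_sect

[default_sect]
activate = 1

[oqsprovider_sect]
activate = 1

[ssl_sect]
system_default = system_default_sect

[system_default_sect]
Groups = x25519_kyber768   # hybrid group only; no classical-only fallback, per the lab's policy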

4) Job signing and forwarding to QPU provider

When the gateway accepted a job bundle, it performed server-side checks and appended an internal signature chain using its HSM-backed key (HSM stored PQ private keys where possible; otherwise, keys were stored in KMS with strict access controls). The gateway then called the QPU provider’s submission API.

// gateway/app.js (Node.js express simplified)
app.post('/submit', authenticateUser, async (req, res) => {
  const { canonical, signature } = req.body;
  // Verify client PQ signature
  if (!await pq.verify(canonical, signature, req.user.pubKey))
    return res.status(400).send({ error: 'invalid client signature' });

  // Server-side canonicalization and policy checks
  const serverSigned = await hsm.sign(canonical);

  // Forward to provider
  const providerResp = await provider.submitJob({
    payload: canonical,
    clientSignature: signature,
    serverSignature: serverSigned
  });

  // Store minimal metadata and return job id
  await db.jobs.insert({ id: providerResp.id, user: req.user.id, timestamp: Date.now() });
  res.send({ id: providerResp.id });
});

5) Testing: simulator → hardware

The lab validated the full path using a QPU simulator (open-source providers) first, then staged to a real QPU in a controlled batch. Important tests included:

  • Key compromise simulation: rotate keys and ensure replay cannot change prior audit entries.
  • Man-in-the-middle tests: validate that PQTLS prevents downgrade to a classical-only key exchange (a quick check sketch follows this list).
  • Performance profiling: measure TLS handshake time (hybrid PQ handshakes are slightly slower but acceptable).
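
A quick way to exercise the downgrade test above is to offer only a classical group from a test client and confirm the handshake is refused. The sketch below assumes an OQS-enabled openssl binary on the test machine and a hypothetical hostname; the hybrid group name again depends on your build.

# Should FAIL: client offers only a classical group; the gateway accepts the hybrid KEM only
openssl s_client -connect portal.lab.example:443 -tls1_3 -groups X25519 < /dev/null

# Should SUCCEED: client offers the hybrid group
openssl s_client -connect portal.lab.example:443 -tls1_3 -groups x25519_kyber768 < /dev/null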

Deployment notes, ops, and gotchas

Operational checklist

  • Containerize gateway and app for reproducible builds; pin liboqs versions; follow container & micro-edge deployment patterns from the micro-edge field review when operating in constrained environments.
  • Automate key rotation: ephemeral session KEM keys every session; server-side long-lived keys rotated annually and stored in an HSM/KMS.
  • Enable tight logging with redaction of binary payloads; keep job IDs and signatures for audits (a redaction sketch follows this list).
  • Provide a fallback page for legacy browsers explaining why the connection is refused (do not downgrade to classical TLS automatically).
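
A minimal sketch of the redaction idea from the logging item, reusing the Express gateway from earlier; the logger handle and field names are assumptions. The raw payload never reaches the log, only its digest, while the job metadata and signature are kept verbatim for audits.

// gateway logging middleware (sketch): record identifiers and digests, never raw payload bytes
const crypto = require('crypto');

function auditLog(req, res, next) {
  const { canonical, signature } = req.body || {};
  logger.info({
    route: req.path,
    user: req.user ? req.user.id : 'unauthenticated',
    payloadDigest: canonical
      ? crypto.createHash('sha256').update(JSON.stringify(canonical)).digest('hex')
      : null,
    signature   // kept verbatim so audit entries can be re-verified later
  });
  next();
}

app.use('/submit', auditLog);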

Performance and UX trade-offs

Hybrid PQ handshakes added ~20–40ms to cold TLS handshakes on lab hardware in late 2025, and WASM local models added 200–500ms for preprocessing on student laptops. The team optimized by caching session tickets (still using hybrid KEM) and throttling model size based on device capability; these optimizations align with patterns in the Quantum Edge writeups.
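
Model throttling can key off coarse device signals available in the browser; a minimal sketch is below. The thresholds, file names, and helper names are illustrative, and navigator.deviceMemory is not exposed by every browser, so the code falls back conservatively.

// main.js (browser): pick a model variant from coarse device signals (illustrative thresholds)
function pickModelVariant() {
  const memGiB = navigator.deviceMemory || 4;          // undefined in some browsers
  const cores = navigator.hardwareConcurrency || 2;
  if (memGiB >= 8 && cores >= 8) return 'q4-medium.wasm';
  if (memGiB >= 4) return 'q4-small.wasm';
  return null;                                         // older devices take the no-LLM path
}

async function initPreprocessor() {
  const variant = pickModelVariant();
  if (variant) {
    await loadModel(`/wasm/${variant}`);               // loadModel() as in the worker sketch above
  } else {
    enableStrictValidationFallback();                  // assumed helper: stricter validation rules, no local model
  }
}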

Regulatory and compliance

The university’s IT security group required documentation of the PQ algorithms and a risk assessment. We included NIST PQC references and vendor attestations for liboqs builds. Keep your supply chain auditable: store build artifacts and binary checksums in the institution’s artifact registry.
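
Recording checksums can be a single CI step; the artifact names and registry upload below are placeholders.

# CI step (sketch): checksum the PQ build artifacts before publishing
sha256sum build/liboqs-*.tar.gz wasm/pq-lib.wasm > checksums.sha256
# upload checksums.sha256 next to the artifacts in the institutional registry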

Code snippets: PQ signing (WASM) and Gateway verification

Example WASM glue signing helper (browser):

let sessionKeypair = null; // generated once per browser session; the private key stays in memory only

async function pqSign(payload) {
  const pq = await PQWASM(); // pq exposes the Dilithium API
  if (!sessionKeypair) sessionKeypair = pq.generateDilithiumKeypair();
  const sig = pq.dilithiumSign(sessionKeypair.privateKey, new TextEncoder().encode(payload));
  return { signature: base64Encode(sig), pubKey: base64Encode(sessionKeypair.publicKey) };
}

Gateway-side verification and HSM signing (Node.js pseudo):

const pq = require('pq-client');
app.post('/submit', async (req, res) => {
  const { canonical, signature, pubKey } = req.body;
  if (!pq.dilithiumVerify(pubKey, canonical, signature))
    return res.status(400).send('Invalid signature');

  const serverSig = await hsm.sign('dilithium', canonical);
  // forward to provider
});

Operational story: what went well and lessons learned

  • Client-side preprocessing dramatically reduced sensitive metadata sent to the provider and shortened developer feedback loops.
  • Early investment in PQTLS tooling avoided an expensive re-certification step later. The hybrid approach allowed interoperability with classical clients during phased rollouts.
  • Key management complexity rose — the team invested in HSM-backed KMS and automated rotation to keep complexity manageable.
  • Students appreciated the local explainability from the browser LLM, but some older devices struggled — provide a no-LLM fallback with stricter validation rules.

Advanced strategies and future directions (2026+)

  • Move to client-side WebAuthn with PQ attestation once browser vendors expose PQ primitives natively.
  • Consider multi-party signing for high-value experiments: split signing between the gateway and a vault to reduce insider risk.
  • Explore secure enclaves at the gateway (SGX/SEV) for in-transit payload processing if regulatory requirements demand stronger in-memory protection; the micro-edge deployment patterns may inform your enclave placement.
  • Anchor job metadata to an immutable ledger for irrefutable experiment provenance (use a permissioned L1 or institutional notary service).
"Adopting hybrid PQTLS now lets a research group remain both interoperable and forward-secure — and the UX gains from local preprocessing are immediate." — Lead engineer, Quantum Lab (2025)

Checklist: Ready-to-deploy items

  • WASM local-AI artifacts and validators (SRI-verified)
  • liboqs + OQS-OpenSSL build pipeline in CI
  • HSM/KMS integration for server keys
  • PQTLS-enabled reverse proxy or gateway container image
  • Provider API integration tests and simulator suite
  • Operational runbooks for key rotation and incident response

Final recommendations for labs and developers

If you’re evaluating a similar deployment, start with the client-side UX and a simulator loop. Local preprocessing will reveal the common failure modes in job manifests and reduce accidental leakage. Run your PQTLS gateway in a reproducible container environment and treat PQ crypto libraries like any other supply-chain dependency — pin versions and scan builds. Finally, assume you’ll need to rotate long-lived keys and design your system to make rotation painless.

Conclusion & call to action

This university lab’s deployment demonstrates a practical path to secure, future-proof quantum job submission portals: combine local-AI preprocessing in the browser with hybrid post-quantum TLS and PQ signatures for a defensible chain of custody. The approach balances immediate UX improvements with long-term cryptographic resilience — a model any research group or early adopter lab can implement in 2026.

Ready to adapt this design for your lab? Download the lab’s reference repo (WASM artifacts, Dockerfile for PQTLS gateway, and test scripts) from the institutional mirror, or schedule a technical walkthrough with our team to map this architecture to your QPU provider and compliance needs.

Call to action: Get the reference deployment kit and step-by-step CI pipeline. Contact qbit365.co.uk for a tailored lab workshop and a 2-week pilot integration plan.
