Student Builder Challenge

Build something real on AIEP. The builder challenge is open to university students and recent graduates who want to contribute to open knowledge infrastructure and produce a portfolio piece that demonstrates genuine systems thinking.

There are no entry fees, no institutional barriers, and no requirement to be enrolled in a specific programme. The only requirement is a working submission.


Challenge categories

Choose one category. Submissions must be functional — a live demo, a deployed endpoint, or a runnable repository. Documentation and a short write-up explaining your design decisions are required for all categories.

Category A — Mirror toolkits

Build a publishing toolkit that makes it easy for a non-technical user to deploy an AIEP Mirror. Target at least one common platform (WordPress, Obsidian, Hugo, Next.js, plain HTML/static). A good submission produces a conformant /.well-known/aiep/index.json, a valid metadata.json, and at least one additional artefact type. Bonus: automated hash generation on publish.
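The hash-generation bonus can be sketched in a few lines. The field names below (artefacts, path, hash) and the sha256: prefix are illustrative assumptions, not the canonical AIEP index.json schema — a real toolkit would follow the published schemas:

```python
import hashlib
import json

def sha256_of(content: bytes) -> str:
    """Hex SHA-256 digest, prefixed so the algorithm is explicit."""
    return "sha256:" + hashlib.sha256(content).hexdigest()

def build_index(artefacts: dict) -> str:
    """Assemble a minimal index.json-style listing on publish.

    Keys are artefact paths, values are file contents. The structure
    here is an assumption for illustration, not the AIEP schema.
    """
    entries = [
        {"path": path, "hash": sha256_of(content)}
        for path, content in sorted(artefacts.items())
    ]
    return json.dumps({"artefacts": entries}, indent=2)

print(build_index({"metadata.json": b'{"title": "example"}'}))
```

Running this at publish time keeps the hashes in the index in step with the files they describe, which is what "automated hash generation on publish" buys you.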

Category B — Validators and linters

Build a tool that checks an AIEP artefact or Mirror against the canonical schemas and reports conformance with precise, actionable error messages. A good submission covers schema validation, hash verification, and registry resolution. Bonus: a CLI and a web interface from the same codebase.
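The shape of "precise, actionable error messages" matters as much as the checks themselves. A minimal sketch, assuming hypothetical required fields and a sha256: hash format (a real validator would load the canonical schemas instead):

```python
import hashlib

def check_artefact(metadata: dict, content: bytes) -> list:
    """Return a list of actionable error messages; an empty list means pass.

    The required fields ('id', 'hash') and the hash format are
    assumptions for illustration, not the canonical AIEP schema.
    """
    errors = []
    for field in ("id", "hash"):
        if field not in metadata:
            errors.append(f"missing required field '{field}'")
    declared = metadata.get("hash", "")
    if declared:
        actual = "sha256:" + hashlib.sha256(content).hexdigest()
        if declared != actual:
            errors.append(
                f"hash mismatch: declared {declared!r}, computed {actual!r}"
            )
    return errors
```

Reporting both the declared and the computed hash in the same message is what makes the error actionable: the user can see immediately whether the file changed or the metadata is stale.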

Category C — Retrieval demonstrations

Build a demonstration that shows the measurable difference between retrieving from an AIEP-conformant source and a non-conformant one. What can you verify from one that you cannot from the other? A good submission shows the gap clearly and proposes what a retrieval agent should do differently in each case.
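The gap can be framed as a checklist of what each source lets an agent verify. The checks below (integrity, provenance, freshness) and the field names are illustrative assumptions about what a conformant source publishes alongside its content:

```python
import hashlib

def verifiable_claims(source: dict, content: bytes) -> dict:
    """What a retrieval agent can verify about retrieved content.

    The fields checked here are assumptions for illustration, not the
    canonical AIEP metadata.
    """
    declared = source.get("hash")
    return {
        "integrity": declared == hashlib.sha256(content).hexdigest(),
        "provenance": "publisher_id" in source,
        "freshness": "published_at" in source,
    }

content = b"some retrieved document"
conformant = {
    "hash": hashlib.sha256(content).hexdigest(),
    "publisher_id": "example.org",
    "published_at": "2025-01-01",
}
plain = {}  # a non-conformant source: just bytes, nothing to check against

print(verifiable_claims(conformant, content))  # every check passes
print(verifiable_claims(plain, content))       # nothing is verifiable
```

A retrieval agent could then treat the second result differently, e.g. by down-weighting or flagging content it cannot verify.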

Category D — Vertical applications

Take AIEP into a specific domain — legal records, medical evidence, academic citations, financial audit, construction instructions — and build the domain-specific schema extensions, claim-type definitions, and registry entries it would require. A good submission includes at least one new schema, a sample artefact set, and a brief on the domain-specific trust requirements.
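A domain extension can start as small as one claim-type definition plus a check against it. Everything below — the claim-type name, the required fields, the legal-records framing — is a hypothetical sketch, not part of any canonical AIEP schema:

```python
# A hypothetical claim-type definition for a legal-records vertical.
LEGAL_CLAIM_SCHEMA = {
    "claim_type": "legal.court_record",
    "required": ["claim_type", "jurisdiction", "case_ref"],
}

def validate_claim(claim: dict) -> list:
    """Minimal check of a claim against the sketch schema above."""
    errors = [
        f"missing field '{field}'"
        for field in LEGAL_CLAIM_SCHEMA["required"]
        if field not in claim
    ]
    if claim.get("claim_type") != LEGAL_CLAIM_SCHEMA["claim_type"]:
        errors.append("claim_type must be 'legal.court_record'")
    return errors
```

The brief on domain-specific trust requirements would then justify each required field: why a court record without a jurisdiction, for instance, cannot be trusted.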

Category E — Recall engine

Implement a working simulation of the P22 deterministic context reconstruction engine. Given an archived set of AIEP artefacts and a RecallScope definition, reconstruct the historical context state and produce a ContextReconstructionHash. Demonstrate bit-identical output across two independent runs. A good submission includes a test suite that validates determinism.
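The determinism requirement usually comes down to two design choices: a fixed ordering of artefacts and a canonical serialization before hashing. A minimal sketch — the name ContextReconstructionHash comes from the challenge text, the rest is illustrative:

```python
import hashlib
import json

def reconstruction_hash(artefacts: list) -> str:
    """Hash a reconstructed context state deterministically.

    Sorting by artefact id and canonical JSON (sorted keys, fixed
    separators) are the assumptions that make the output bit-identical
    across independent runs.
    """
    ordered = sorted(artefacts, key=lambda a: a["id"])
    canonical = json.dumps(ordered, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode()).hexdigest()

# Two runs over the same artefact set, delivered in different orders:
run_a = reconstruction_hash([{"id": "b", "v": 2}, {"id": "a", "v": 1}])
run_b = reconstruction_hash([{"id": "a", "v": 1}, {"id": "b", "v": 2}])
assert run_a == run_b  # bit-identical regardless of input ordering
```

A test suite for determinism would assert exactly this property across shuffled inputs, repeated runs, and ideally separate processes.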


Evaluation criteria

All submissions are evaluated on the same four criteria:

Correctness — 40% — Does it work? Does it conform to the protocol? Are schemas valid, hashes correct, registry resolution accurate?

Design clarity — 25% — Is the architecture sensible? Are the design decisions explained and defensible? Would someone else be able to extend it?

Genuine usefulness — 25% — Does it solve a real problem in the AIEP ecosystem? Would a builder or adopter actually use it?

Documentation — 10% — Is the submission self-explanatory? Does the write-up explain what was built, why, and what remains to do?

Submissions that chase complexity over correctness score poorly. A simple tool that works completely and is well-documented will outscore a sophisticated tool that is partially broken.


Submission process

  1. Repository — publish your work in a public GitHub repository under an open-source licence (Apache 2.0 preferred)
  2. Write-up — include a SUBMISSION.md in the root of the repository covering: what you built, which category, key design decisions, what works, what does not, and what you would do next
  3. Demo — include either a live URL, a recorded walkthrough (video or GIF), or clear instructions to run it locally in under five minutes
  4. Submit — email [email protected] with the subject line AIEP Builder Challenge Submission, your repository URL, and your institution (if applicable)

Submissions are accepted on a rolling basis. There is no closing date. Strong submissions will receive a response within four weeks.


What you gain

Feedback — every complete submission receives written feedback on correctness, design, and what would need to change for it to be production-ready.

Attribution — accepted submissions are listed on the AIEP Hub with your name, institution, and a link to your repository. Your work becomes part of the public record of the AIEP ecosystem.

Collaboration pathway — submissions that address a genuine gap in the ecosystem may be invited into a formal collaboration, with terms agreed in writing. There is no obligation on either side.

Portfolio substance — a working AIEP implementation demonstrates protocol-level systems thinking, schema design, cryptographic verification, and machine-readable publishing — skills that are rare and in demand.


Commercial terms

For student and educational submissions, engagement is free and there is no revenue expectation. If a submission evolves into a commercial product or service via an institution, participation terms can be structured to benefit the builder and the institution — subject to formal agreement.

No commercial arrangement is implied or guaranteed by participation in the challenge.


See: Research & Academia · Builders · Protocol · Schema Catalogue