Effective: 30 March 2026 — GENŌ Intelligentia Limited
GENŌ Intelligentia Limited was founded on the belief that artificial intelligence should serve people — not harm them. We build AI products and systems for the real world, and we take full responsibility for what we build, who uses it, and how it is used.
This document makes our ethical commitments explicit and unconditional. It is not a marketing statement. It is a binding declaration of what this company stands for and what it will never do, regardless of commercial pressure or opportunity.
These commitments apply to every product we ship, every service we provide, every line of code we write, and every partnership we enter into.
GENŌ Intelligentia will never design, develop, supply, or knowingly contribute to any system whose intended or primary function is to harm, injure, or kill human beings or animals.
This commitment is absolute. It applies regardless of the identity of the requesting party, the stated justification, the size of the contract, or the jurisdiction in which the system would be deployed.
GENŌ Intelligentia does not support war, armed conflict, or political violence in any form. We will not supply products, services, or technical expertise to any party — state, non-state, or private — for the purpose of conducting, facilitating, or escalating armed conflict.
We recognise that AI and data systems can be misused to enable violence even when not explicitly designed for it. We therefore take active steps to understand the end use of our technology and reserve the right to refuse, terminate, or revoke access to any party whose use we determine to be in breach of this commitment.
This includes refusing to build systems whose foreseeable use would enable such violence.
Every product GENŌ Intelligentia builds is designed with a single overriding purpose: to advance human capability, safety, knowledge, or wellbeing. We build tools that help people work smarter, stay safer, understand their world better, and make more informed decisions.
VisionGuard exists to protect retailers and their staff from theft — not to build surveillance states. NewsTrac exists to help people understand global events with greater clarity — not to enable propaganda or manipulation. Every future product will be held to the same standard.
Before shipping any product or feature, we ask: does this make someone's life meaningfully better? If the honest answer is no, we do not ship it.
We design our products with privacy as a foundational principle, not an afterthought.
We will be honest about what our products can and cannot do. We will not overstate capability, fabricate evidence of performance, or make claims we cannot substantiate. We will be transparent about limitations, failure modes, and appropriate use cases.
When our products make mistakes — and all AI systems do — we will acknowledge them, investigate them, and address them. We will not hide errors behind complexity or blame users for misuse that we could have anticipated.
GENŌ Intelligentia products and services must not be used in any way that is detrimental to individuals, communities, or society.
Breach of this principle constitutes grounds for immediate termination of access to any GENŌ Intelligentia product or service, without refund and without notice where urgency requires it.
These commitments are not aspirational. They are the operating standard of this company. GENŌ Intelligentia Limited, as a company, is fully accountable for ensuring these principles are upheld in every decision we make.
If you believe a GENŌ Intelligentia product is being used in violation of these principles, or if you have concerns about a product we are developing, we want to hear from you.
We may update this Ethics & Values statement as our products evolve and as new ethical questions emerge in the AI landscape. Any material change will be reflected in the effective date at the top of this page. Our core commitments — no weapons, no war, no harm — will never be weakened by any update.
For our terms governing use of our products and services, see our Terms of Use.