add faceid demo

pull/233/head
Vladimir Mandic 2021-11-11 11:30:55 -05:00
parent 6b93191080
commit e85b359095
26 changed files with 1066 additions and 818 deletions


@@ -152,11 +152,11 @@
 "external": ["*/human.esm.js"]
 },
 {
-"name": "demo/facerecognition",
+"name": "demo/faceid",
 "platform": "browser",
 "format": "esm",
-"input": "demo/facerecognition/index.ts",
-"output": "demo/facerecognition/index.js",
+"input": "demo/faceid/index.ts",
+"output": "demo/faceid/index.js",
 "sourcemap": true,
 "external": ["*/human.esm.js"]
 }


@@ -29,7 +29,7 @@
 "assets",
 "demo/helpers/*.js",
 "demo/typescript/*.js",
-"demo/facerecognition/*.js",
+"demo/faceid/*.js",
 "dist",
 "media",
 "models",
@@ -49,6 +49,7 @@
 "func-names": "off",
 "guard-for-in": "off",
 "import/extensions": "off",
+"import/named": "off",
 "import/no-extraneous-dependencies": "off",
 "import/no-named-as-default": "off",
 "import/no-unresolved": "off",


@@ -9,8 +9,9 @@
 ## Changelog
-### **HEAD -> main** 2021/11/09 mandic00@live.com
+### **HEAD -> main** 2021/11/10 mandic00@live.com
+- auto tensor shape and channels handling
 - disable use of path2d in node
 - add liveness module and facerecognition demo
 - initial version of facerecognition demo


@@ -49,7 +49,7 @@ JavaScript module using TensorFlow/JS Machine Learning library
 - **Full** [[*Live*]](https://vladmandic.github.io/human/demo/index.html) [[*Details*]](https://github.com/vladmandic/human/tree/main/demo): Main browser demo app that showcases all Human capabilities
 - **Simple** [[*Live*]](https://vladmandic.github.io/human/demo/typescript/index.html) [[*Details*]](https://github.com/vladmandic/human/tree/main/demo/typescript): Simple WebCam processing demo in TypeScript
 - **Face Match** [[*Live*]](https://vladmandic.github.io/human/demo/facematch/index.html) [[*Details*]](https://github.com/vladmandic/human/tree/main/demo/facematch): Extracts faces from images, calculates face descriptors and similarities and matches them to known database
-- **Face Recognition** [[*Live*]](https://vladmandic.github.io/human/demo/facerecognition/index.html) [[*Details*]](https://github.com/vladmandic/human/tree/main/demo/facerecognition): Runs multiple checks to validate webcam input before performing face match, similar to *FaceID*
+- **Face ID** [[*Live*]](https://vladmandic.github.io/human/demo/faceid/index.html) [[*Details*]](https://github.com/vladmandic/human/tree/main/demo/faceid): Runs multiple checks to validate webcam input before performing face match to faces in IndexDB
 - **Multi-thread** [[*Live*]](https://vladmandic.github.io/human/demo/multithread/index.html) [[*Details*]](https://github.com/vladmandic/human/tree/main/demo/multithread): Runs each `human` module in a separate web worker for highest possible performance
 - **Face 3D** [[*Live*]](https://vladmandic.github.io/human/demo/face3d/index.html) [[*Details*]](https://github.com/vladmandic/human/tree/main/demo/face3d): Uses WebCam as input and draws 3D render of face mesh using `Three.js`
 - **Virtual Avatar** [[*Live*]](https://vladmandic.github.io/human-vrm/src/human-vrm.html) [[*Details*]](https://github.com/vladmandic/human-vrm): VR model with head, face, eye, body and hand tracking


@@ -47,3 +47,7 @@ New:
 - new optional model `liveness`
   checks if input appears to be a real-world live image or a recording
   best used together with `antispoofing` that checks if input appears to have a realistic face
+Other:
+- Improved **Safari** compatibility
+- Documentation overhaul


@@ -1,6 +1,7 @@
-# Human Face Recognition
+# Human Face Recognition: FaceID
-`facerecognition` runs multiple checks to validate webcam input before performing face match, similar to *FaceID*
+`faceid` runs multiple checks to validate webcam input before performing face match
+Detected face image and descriptor are stored in client-side IndexDB
 ## Workflow
 - Starts webcam
@@ -10,8 +11,8 @@
 - Face and gaze direction
 - Detection scores
 - Blink detection (including temporal check for blink speed) to verify live input
-- Runs antispoofing optional module
-- Runs liveness optional module
+- Runs `antispoofing` optional module
+- Runs `liveness` optional module
 - Runs match against database of registered faces and presents best match with scores
 ## Notes
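The pass/fail checks listed in the workflow above are combined into a single gate before any face matching happens. A minimal TypeScript sketch of that gating idea (the names mirror the demo's `ok` object and `allOk()` helper, but this is a simplified illustration, not the demo source — for instance, it omits the `elapsedMs` timer the demo also tracks):

```typescript
// Simplified sketch of the demo's validation gate (illustrative, not the demo source).
type Checks = {
  faceCount: boolean;       // exactly one face detected
  faceConfidence: boolean;  // detection score above minConfidence
  facingCenter: boolean;    // face and gaze direction
  blinkDetected: boolean;   // temporal blink check
  faceSize: boolean;        // face box at least minSize pixels
  antispoofCheck: boolean;  // antispoofing module result
  livenessCheck: boolean;   // liveness module result
};

// every check must pass before a match is attempted
const allOk = (ok: Checks): boolean => Object.values(ok).every((passed) => passed === true);

const ok: Checks = { faceCount: true, faceConfidence: true, facingCenter: true, blinkDetected: false, faceSize: true, antispoofCheck: true, livenessCheck: true };
console.log(allOk(ok)); // false until a blink is detected
```

Because every check is a hard boolean gate, a single failing module (here the blink check) blocks the match step entirely.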


@@ -16,15 +16,24 @@
 <style>
 @font-face { font-family: 'Lato'; font-display: swap; font-style: normal; font-weight: 100; src: local('Lato'), url('../../assets/lato-light.woff2') }
 html { font-family: 'Lato', 'Segoe UI'; font-size: 16px; font-variant: small-caps; }
-body { margin: 0; background: black; color: white; overflow-x: hidden; width: 100vw; height: 100vh; }
+body { margin: 0; padding: 16px; background: black; color: white; overflow-x: hidden; width: 100vw; height: 100vh; }
 body::-webkit-scrollbar { display: none; }
+.button { padding: 2px; cursor: pointer; box-shadow: 2px 2px black; width: 64px; text-align: center; margin-left: 16px; height: 16px }
 </style>
 </head>
 <body>
-<canvas id="canvas" style="margin: 0 auto; width: 100%"></canvas>
+<canvas id="canvas" style="padding: 8px"></canvas>
+<canvas id="source" style="padding: 8px"></canvas>
 <video id="video" playsinline style="display: none"></video>
 <pre id="fps" style="position: absolute; top: 12px; right: 20px; background-color: grey; padding: 8px; box-shadow: 2px 2px black"></pre>
 <pre id="log" style="padding: 8px"></pre>
+<div id="match" style="display: none; padding: 8px">
+<label for="name">name:</label>
+<input id="name" type="text" value="" style="height: 16px; border: none; padding: 2px; margin-left: 8px">
+<span id="save" class="button" style="background-color: royalblue">save</span>
+<span id="delete" class="button" style="background-color: lightcoral">delete</span>
+</div>
+<div id="retry" class="button" style="background-color: darkslategray; width: 350px">retry</div>
 <div id="status" style="position: absolute; bottom: 0; width: 100%; padding: 8px; font-size: 0.8rem;"></div>
 </body>
 </html>


@@ -4,8 +4,67 @@
 author: <https://github.com/vladmandic>'
 */
-// demo/facerecognition/index.ts
+// demo/faceid/index.ts
 import { Human } from "../../dist/human.esm.js";
+// demo/faceid/indexdb.ts
+var db;
+var database = "human";
+var table = "person";
+var log = (...msg) => console.log("indexdb", ...msg);
+async function open() {
+if (db)
+return true;
+return new Promise((resolve) => {
+const request = indexedDB.open(database, 1);
+request.onerror = (evt) => log("error:", evt);
+request.onupgradeneeded = (evt) => {
+log("create:", evt.target);
+db = evt.target.result;
+db.createObjectStore(table, { keyPath: "id", autoIncrement: true });
+};
+request.onsuccess = (evt) => {
+db = evt.target.result;
+log("open:", db);
+resolve(true);
+};
+});
+}
+async function load() {
+const faceDB = [];
+if (!db)
+await open();
+return new Promise((resolve) => {
+const cursor = db.transaction([table], "readwrite").objectStore(table).openCursor(null, "next");
+cursor.onerror = (evt) => log("load error:", evt);
+cursor.onsuccess = (evt) => {
+if (evt.target.result) {
+faceDB.push(evt.target.result.value);
+evt.target.result.continue();
+} else {
+resolve(faceDB);
+}
+};
+});
+}
+async function save(faceRecord) {
+if (!db)
+await open();
+const newRecord = { name: faceRecord.name, descriptor: faceRecord.descriptor, image: faceRecord.image };
+db.transaction([table], "readwrite").objectStore(table).put(newRecord);
+log("save:", newRecord);
+}
+async function remove(faceRecord) {
+if (!db)
+await open();
+db.transaction([table], "readwrite").objectStore(table).delete(faceRecord.id);
+log("delete:", faceRecord);
+}
+// demo/faceid/index.ts
+var db2 = [];
+var face;
+var current;
 var humanConfig = {
 modelBasePath: "../../models",
 filter: { equalization: true },
@@ -24,12 +83,12 @@ var humanConfig = {
 gesture: { enabled: true }
 };
 var options = {
-faceDB: "../facematch/faces.json",
 minConfidence: 0.6,
 minSize: 224,
 maxTime: 1e4,
 blinkMin: 10,
-blinkMax: 800
+blinkMax: 800,
+threshold: 0.5
 };
 var ok = {
 faceCount: false,
@@ -47,7 +106,6 @@ var blink = {
 end: 0,
 time: 0
 };
-var db = [];
 var human = new Human(humanConfig);
 human.env["perfadd"] = false;
 human.draw.options.font = 'small-caps 18px "Lato"';
@@ -57,12 +115,18 @@ var dom = {
 canvas: document.getElementById("canvas"),
 log: document.getElementById("log"),
 fps: document.getElementById("fps"),
-status: document.getElementById("status")
+status: document.getElementById("status"),
+match: document.getElementById("match"),
+name: document.getElementById("name"),
+save: document.getElementById("save"),
+delete: document.getElementById("delete"),
+retry: document.getElementById("retry"),
+source: document.getElementById("source")
 };
 var timestamp = { detect: 0, draw: 0 };
 var fps = { detect: 0, draw: 0 };
 var startTime = 0;
-var log = (...msg) => {
+var log2 = (...msg) => {
 dom.log.innerText += msg.join(" ") + "\n";
 console.log(...msg);
 };
@@ -80,7 +144,8 @@ async function webCam() {
 await ready;
 dom.canvas.width = dom.video.videoWidth;
 dom.canvas.height = dom.video.videoHeight;
-log("video:", dom.video.videoWidth, dom.video.videoHeight, stream.getVideoTracks()[0].label);
+if (human.env.initial)
+log2("video:", dom.video.videoWidth, dom.video.videoHeight, "|", stream.getVideoTracks()[0].label);
 dom.canvas.onclick = () => {
 if (dom.video.paused)
 dom.video.play();
@@ -90,6 +155,8 @@ async function webCam() {
 }
 async function detectionLoop() {
 if (!dom.video.paused) {
+if (face && face.tensor)
+human.tf.dispose(face.tensor);
 await human.detect(dom.video);
 const now = human.now();
 fps.detect = 1e3 / (now - timestamp.detect);
@@ -124,59 +191,109 @@ async function validationLoop() {
 printStatus(ok);
 if (allOk()) {
 dom.video.pause();
-return human.result.face;
+return human.result.face[0];
+} else {
+human.tf.dispose(human.result.face[0].tensor);
 }
 if (ok.elapsedMs > options.maxTime) {
 dom.video.pause();
-return human.result.face;
+return human.result.face[0];
 } else {
 ok.elapsedMs = Math.trunc(human.now() - startTime);
 return new Promise((resolve) => {
 setTimeout(async () => {
 const res = await validationLoop();
 if (res)
-resolve(human.result.face);
+resolve(human.result.face[0]);
 }, 30);
 });
 }
 }
-async function detectFace(face) {
-dom.canvas.width = face.tensor.shape[2];
-dom.canvas.height = face.tensor.shape[1];
+async function saveRecords() {
+var _a;
+if (dom.name.value.length > 0) {
+const image = (_a = dom.canvas.getContext("2d")) == null ? void 0 : _a.getImageData(0, 0, dom.canvas.width, dom.canvas.height);
+const rec = { id: 0, name: dom.name.value, descriptor: face.embedding, image };
+await save(rec);
+log2("saved face record:", rec.name);
+db2.push(rec);
+} else {
+log2("invalid name");
+}
+}
+async function deleteRecord() {
+if (current.id > 0) {
+await remove(current);
+}
+}
+async function detectFace() {
+var _a;
+if (!face || !face.tensor || !face.embedding)
+return 0;
+dom.canvas.width = face.tensor.shape[1] || 0;
+dom.canvas.height = face.tensor.shape[0] || 0;
+dom.source.width = dom.canvas.width;
+dom.source.height = dom.canvas.height;
 dom.canvas.style.width = "";
 human.tf.browser.toPixels(face.tensor, dom.canvas);
-human.tf.dispose(face.tensor);
-const arr = db.map((rec) => rec.embedding);
-const res = await human.match(face.embedding, arr);
-log(`found best match: ${db[res.index].name} similarity: ${Math.round(1e3 * res.similarity) / 10}% source: ${db[res.index].source}`);
+const descriptors = db2.map((rec) => rec.descriptor);
+const res = await human.match(face.embedding, descriptors);
+dom.match.style.display = "flex";
+dom.retry.style.display = "block";
+if (res.index === -1) {
+log2("no matches");
+dom.delete.style.display = "none";
+dom.source.style.display = "none";
+} else {
+current = db2[res.index];
+log2(`best match: ${current.name} | id: ${current.id} | similarity: ${Math.round(1e3 * res.similarity) / 10}%`);
+dom.delete.style.display = "";
+dom.name.value = current.name;
+dom.source.style.display = "";
+(_a = dom.source.getContext("2d")) == null ? void 0 : _a.putImageData(current.image, 0, 0);
 }
-async function loadFaceDB() {
-const res = await fetch(options.faceDB);
-db = res && res.ok ? await res.json() : [];
-log("loaded face db:", options.faceDB, "records:", db.length);
+return res.similarity > options.threshold;
 }
 async function main() {
-log("human version:", human.version, "| tfjs version:", human.tf.version_core);
-printFPS("loading...");
-await loadFaceDB();
-await human.load();
-printFPS("initializing...");
-await human.warmup();
+ok.faceCount = false;
+ok.faceConfidence = false;
+ok.facingCenter = false;
+ok.blinkDetected = false;
+ok.faceSize = false;
+ok.antispoofCheck = false;
+ok.livenessCheck = false;
+ok.elapsedMs = 0;
+dom.match.style.display = "none";
+dom.retry.style.display = "none";
+document.body.style.background = "black";
 await webCam();
 await detectionLoop();
 startTime = human.now();
-const face = await validationLoop();
+face = await validationLoop();
-if (!allOk())
-log("did not find valid input", face);
-else {
-log("found valid face", face);
-await detectFace(face[0]);
-}
 dom.fps.style.display = "none";
+if (!allOk()) {
+log2("did not find valid input", face);
+return 0;
+} else {
+const res = await detectFace();
+document.body.style.background = res ? "darkgreen" : "maroon";
+return res;
 }
-window.onload = main;
+}
+async function init() {
+log2("human version:", human.version, "| tfjs version:", human.tf.version_core);
+log2("options:", JSON.stringify(options).replace(/{|}|"|\[|\]/g, "").replace(/,/g, " "));
+printFPS("loading...");
+db2 = await load();
+log2("loaded face records:", db2.length);
+await webCam();
+await human.load();
+printFPS("initializing...");
+dom.retry.addEventListener("click", main);
+dom.save.addEventListener("click", saveRecords);
+dom.delete.addEventListener("click", deleteRecord);
+await human.warmup();
+await main();
+}
+window.onload = init;
 /**
 * Human demo for browsers
 * @default Human Library

demo/faceid/index.js.map Normal file

File diff suppressed because one or more lines are too long


@@ -7,14 +7,19 @@
 * @license MIT
 */
-import { Human } from '../../dist/human.esm.js'; // equivalent of @vladmandic/Human
+import { Human, TensorLike, FaceResult } from '../../dist/human.esm.js'; // equivalent of @vladmandic/Human
+import * as indexDb from './indexdb'; // methods to deal with indexdb
+let db: Array<indexDb.FaceRecord> = []; // face descriptor database stored in indexdb
+let face: FaceResult; // face result from human.detect
+let current: indexDb.FaceRecord; // currently matched db record
 const humanConfig = { // user configuration for human, used to fine-tune behavior
 modelBasePath: '../../models',
 filter: { equalization: true }, // let's run with histogram equalizer
 face: {
 enabled: true,
-detector: { rotation: true, return: true }, // return tensor is not really needed except to draw detected face
+detector: { rotation: true, return: true }, // return tensor is used to get detected face image
 description: { enabled: true },
 iris: { enabled: true }, // needed to determine gaze direction
 emotion: { enabled: false }, // not needed
@@ -24,16 +29,16 @@ const humanConfig = { // user configuration for human, used to fine-tune behavio
 body: { enabled: false },
 hand: { enabled: false },
 object: { enabled: false },
-gesture: { enabled: true },
+gesture: { enabled: true }, // parses face and iris gestures
 };
 const options = {
-faceDB: '../facematch/faces.json',
-minConfidence: 0.6, // overall face confidence for box, face, gender, real
+minConfidence: 0.6, // overall face confidence for box, face, gender, real, live
 minSize: 224, // min input to face descriptor model before degradation
 maxTime: 10000, // max time before giving up
 blinkMin: 10, // minimum duration of a valid blink
 blinkMax: 800, // maximum duration of a valid blink
+threshold: 0.5, // minimum similarity
 };
 const ok = { // must meet all rules
@@ -54,7 +59,7 @@ const blink = { // internal timers for blink start/end/duration
 time: 0,
 };
-let db: Array<{ name: string, source: string, embedding: number[] }> = []; // holds loaded face descriptor database
+// let db: Array<{ name: string, source: string, embedding: number[] }> = []; // holds loaded face descriptor database
 const human = new Human(humanConfig); // create instance of human with overrides from user configuration
 human.env['perfadd'] = false; // is performance data showing instant or total values
@@ -67,6 +72,12 @@ const dom = { // grab instances of dom objects so we dont have to look them up l
 log: document.getElementById('log') as HTMLPreElement,
 fps: document.getElementById('fps') as HTMLPreElement,
 status: document.getElementById('status') as HTMLPreElement,
+match: document.getElementById('match') as HTMLDivElement,
+name: document.getElementById('name') as HTMLInputElement,
+save: document.getElementById('save') as HTMLSpanElement,
+delete: document.getElementById('delete') as HTMLSpanElement,
+retry: document.getElementById('retry') as HTMLDivElement,
+source: document.getElementById('source') as HTMLCanvasElement,
 };
 const timestamp = { detect: 0, draw: 0 }; // holds information used to calculate performance and possible memory leaks
 const fps = { detect: 0, draw: 0 }; // holds calculated fps information for both detect and screen refresh
@@ -91,7 +102,7 @@ async function webCam() { // initialize webcam
 await ready;
 dom.canvas.width = dom.video.videoWidth;
 dom.canvas.height = dom.video.videoHeight;
-log('video:', dom.video.videoWidth, dom.video.videoHeight, stream.getVideoTracks()[0].label);
+if (human.env.initial) log('video:', dom.video.videoWidth, dom.video.videoHeight, '|', stream.getVideoTracks()[0].label);
 dom.canvas.onclick = () => { // pause when clicked on screen and resume on next click
 if (dom.video.paused) dom.video.play();
 else dom.video.pause();
@@ -100,6 +111,7 @@ async function webCam() { // initialize webcam
 async function detectionLoop() { // main detection loop
 if (!dom.video.paused) {
+if (face && face.tensor) human.tf.dispose(face.tensor); // dispose previous tensor
 await human.detect(dom.video); // actual detection; we're not capturing output in a local variable as it can also be reached via human.result
 const now = human.now();
 fps.detect = 1000 / (now - timestamp.detect);
@@ -108,7 +120,7 @@ async function detectionLoop() { // main detection loop
 }
 }
-async function validationLoop(): Promise<typeof human.result.face> { // main screen refresh loop
+async function validationLoop(): Promise<FaceResult> { // main screen refresh loop
 const interpolated = await human.next(human.result); // smoothen result using last-known results
 await human.draw.canvas(dom.video, dom.canvas); // draw canvas to screen
 await human.draw.all(dom.canvas, interpolated); // draw labels, boxes, lines, etc.
@@ -116,7 +128,6 @@ async function validationLoop(): Promise<typeof human.result.face> { // main scr
 fps.draw = 1000 / (now - timestamp.draw);
 timestamp.draw = now;
 printFPS(`fps: ${fps.detect.toFixed(1).padStart(5, ' ')} detect | ${fps.draw.toFixed(1).padStart(5, ' ')} draw`); // write status
 ok.faceCount = human.result.face.length === 1; // must be exactly one detected face
 if (ok.faceCount) { // skip the rest if no face
 const gestures: string[] = Object.values(human.result.gesture).map((gesture) => gesture.gesture); // flatten all gestures
@@ -130,65 +141,113 @@ async function validationLoop(): Promise<typeof human.result.face> { // main scr
 ok.livenessCheck = (human.result.face[0].live || 0) > options.minConfidence;
 ok.faceSize = human.result.face[0].box[2] >= options.minSize && human.result.face[0].box[3] >= options.minSize;
 }
 printStatus(ok);
 if (allOk()) { // all criteria met
 dom.video.pause();
-return human.result.face;
+return human.result.face[0];
+} else {
+human.tf.dispose(human.result.face[0].tensor); // results are not ok, so let's dispose tensor
 }
 if (ok.elapsedMs > options.maxTime) { // give up
 dom.video.pause();
-return human.result.face;
+return human.result.face[0];
 } else { // run again
 ok.elapsedMs = Math.trunc(human.now() - startTime);
 return new Promise((resolve) => {
 setTimeout(async () => {
 const res = await validationLoop(); // run validation loop until conditions are met
-if (res) resolve(human.result.face); // recursive promise resolve
+if (res) resolve(human.result.face[0]); // recursive promise resolve
 }, 30); // use to slow down refresh from max refresh rate to target of 30 fps
 });
 }
 }
-async function detectFace(face) {
-// draw face and dispose face tensor immediately afterwards
-dom.canvas.width = face.tensor.shape[2];
-dom.canvas.height = face.tensor.shape[1];
-dom.canvas.style.width = '';
-human.tf.browser.toPixels(face.tensor, dom.canvas);
-human.tf.dispose(face.tensor);
-const arr = db.map((rec) => rec.embedding);
-const res = await human.match(face.embedding, arr);
-log(`found best match: ${db[res.index].name} similarity: ${Math.round(1000 * res.similarity) / 10}% source: ${db[res.index].source}`);
+async function saveRecords() {
+if (dom.name.value.length > 0) {
+const image = dom.canvas.getContext('2d')?.getImageData(0, 0, dom.canvas.width, dom.canvas.height) as ImageData;
+const rec = { id: 0, name: dom.name.value, descriptor: face.embedding as number[], image };
+await indexDb.save(rec);
+log('saved face record:', rec.name);
+db.push(rec);
+} else {
+log('invalid name');
+}
 }
-async function loadFaceDB() {
-const res = await fetch(options.faceDB);
-db = (res && res.ok) ? await res.json() : [];
-log('loaded face db:', options.faceDB, 'records:', db.length);
+async function deleteRecord() {
+if (current.id > 0) {
+await indexDb.remove(current);
+}
+}
+async function detectFace() {
+// draw face and dispose face tensor immediately afterwards
+if (!face || !face.tensor || !face.embedding) return 0;
+dom.canvas.width = face.tensor.shape[1] || 0;
+dom.canvas.height = face.tensor.shape[0] || 0;
+dom.source.width = dom.canvas.width;
+dom.source.height = dom.canvas.height;
+dom.canvas.style.width = '';
+human.tf.browser.toPixels(face.tensor as unknown as TensorLike, dom.canvas);
+const descriptors = db.map((rec) => rec.descriptor);
+const res = await human.match(face.embedding, descriptors);
+dom.match.style.display = 'flex';
+dom.retry.style.display = 'block';
+if (res.index === -1) {
+log('no matches');
+dom.delete.style.display = 'none';
+dom.source.style.display = 'none';
+} else {
+current = db[res.index];
+log(`best match: ${current.name} | id: ${current.id} | similarity: ${Math.round(1000 * res.similarity) / 10}%`);
+dom.delete.style.display = '';
+dom.name.value = current.name;
+dom.source.style.display = '';
+dom.source.getContext('2d')?.putImageData(current.image, 0, 0);
+}
+return res.similarity > options.threshold;
 }
async function main() { // main entry point async function main() { // main entry point
log('human version:', human.version, '| tfjs version:', human.tf.version_core); ok.faceCount = false;
printFPS('loading...'); ok.faceConfidence = false;
await loadFaceDB(); ok.facingCenter = false;
await human.load(); // preload all models ok.blinkDetected = false;
printFPS('initializing...'); ok.faceSize = false;
await human.warmup(); // warmup function to initialize backend for future faster detection ok.antispoofCheck = false;
await webCam(); // start webcam ok.livenessCheck = false;
ok.elapsedMs = 0;
dom.match.style.display = 'none';
dom.retry.style.display = 'none';
document.body.style.background = 'black';
await webCam();
await detectionLoop(); // start detection loop await detectionLoop(); // start detection loop
startTime = human.now(); startTime = human.now();
const face = await validationLoop(); // start validation loop face = await validationLoop(); // start validation loop
if (!allOk()) log('did not find valid input', face);
else {
log('found valid face', face);
await detectFace(face[0]);
}
dom.fps.style.display = 'none'; dom.fps.style.display = 'none';
if (!allOk()) {
log('did not find valid input', face);
return 0;
} else {
// log('found valid face');
const res = await detectFace();
document.body.style.background = res ? 'darkgreen' : 'maroon';
return res;
}
} }
window.onload = main; async function init() {
log('human version:', human.version, '| tfjs version:', human.tf.version_core);
log('options:', JSON.stringify(options).replace(/{|}|"|\[|\]/g, '').replace(/,/g, ' '));
printFPS('loading...');
db = await indexDb.load(); // load face database from indexdb
log('loaded face records:', db.length);
await webCam(); // start webcam
await human.load(); // preload all models
printFPS('initializing...');
dom.retry.addEventListener('click', main);
dom.save.addEventListener('click', saveRecords);
dom.delete.addEventListener('click', deleteRecord);
await human.warmup(); // warmup function to initialize backend for future faster detection
await main();
}
window.onload = init;
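The verification gate in `detectFace` above comes down to a single comparison, `res.similarity > options.threshold`, against descriptors loaded from the face database. As a rough illustration of how such a 0..1 similarity score can be derived from two descriptors, here is a minimal sketch; this is not Human's actual `match` implementation, and `similarity`/`bestMatch` are hypothetical helpers for illustration only:

```typescript
// Illustrative only: Human's real scoring may differ from this L2-based sketch.
// Maps the Euclidean distance between two descriptors into a 0..1 similarity.
export function similarity(a: number[], b: number[]): number {
  if (a.length !== b.length || a.length === 0) return 0;
  let sum = 0;
  for (let i = 0; i < a.length; i++) sum += (a[i] - b[i]) ** 2; // squared L2 distance
  const dist = Math.sqrt(sum);
  return Math.max(0, 1 - dist / Math.sqrt(a.length)); // crude length-based normalization
}

// Scan a database of records and return the best match above a threshold, or null.
export function bestMatch(
  probe: number[],
  db: { name: string, descriptor: number[] }[],
  threshold: number,
): { name: string, similarity: number } | null {
  let best = { name: '', similarity: 0 };
  for (const rec of db) {
    const s = similarity(probe, rec.descriptor);
    if (s > best.similarity) best = { name: rec.name, similarity: s };
  }
  return best.similarity > threshold ? best : null;
}
```

The thresholding mirrors the demo's gate: anything at or below `threshold` is treated as "no match" rather than returning the nearest record unconditionally.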

demo/faceid/indexdb.ts Normal file

@@ -0,0 +1,57 @@
let db: IDBDatabase; // instance of indexdb
const database = 'human';
const table = 'person';
export type FaceRecord = { id: number, name: string, descriptor: number[], image: ImageData };
// eslint-disable-next-line no-console
const log = (...msg) => console.log('indexdb', ...msg);
export async function open() {
if (db) return true;
return new Promise((resolve) => {
const request: IDBOpenDBRequest = indexedDB.open(database, 1);
request.onerror = (evt) => log('error:', evt);
request.onupgradeneeded = (evt: IDBVersionChangeEvent) => { // create if doesnt exist
log('create:', evt.target);
db = (evt.target as IDBOpenDBRequest).result;
db.createObjectStore(table, { keyPath: 'id', autoIncrement: true });
};
request.onsuccess = (evt) => { // open
db = (evt.target as IDBOpenDBRequest).result as IDBDatabase;
log('open:', db);
resolve(true);
};
});
}
export async function load(): Promise<FaceRecord[]> {
const faceDB: Array<FaceRecord> = [];
if (!db) await open(); // open or create if not already done
return new Promise((resolve) => {
const cursor: IDBRequest = db.transaction([table], 'readwrite').objectStore(table).openCursor(null, 'next');
cursor.onerror = (evt) => log('load error:', evt);
cursor.onsuccess = (evt) => {
if ((evt.target as IDBRequest).result) {
faceDB.push((evt.target as IDBRequest).result.value);
(evt.target as IDBRequest).result.continue();
} else {
resolve(faceDB);
}
};
});
}
export async function save(faceRecord: FaceRecord) {
if (!db) await open(); // open or create if not already done
const newRecord = { name: faceRecord.name, descriptor: faceRecord.descriptor, image: faceRecord.image }; // omit id as its autoincrement
db.transaction([table], 'readwrite').objectStore(table).put(newRecord);
log('save:', newRecord);
}
export async function remove(faceRecord: FaceRecord) {
if (!db) await open(); // open or create if not already done
db.transaction([table], 'readwrite').objectStore(table).delete(faceRecord.id); // delete based on id
log('delete:', faceRecord);
}
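The module above gives the demo a small persistence contract: records gain an auto-incremented `id` on save, `load` returns everything, and `remove` deletes by `id`. Since IndexedDB is unavailable outside the browser, that contract can be mimicked with a purely in-memory stand-in for unit tests; `MemoryFaceStore` below is a hypothetical helper, not part of the demo:

```typescript
// Hypothetical in-memory stand-in for demo/faceid/indexdb.ts, usable where
// IndexedDB does not exist (e.g. Node). It mirrors the same semantics:
// ids auto-increment starting at 1, and records are removed by id.
type FaceRecordLike = { id?: number, name: string, descriptor: number[] };

export class MemoryFaceStore {
  private records = new Map<number, FaceRecordLike>();
  private nextId = 1;

  save(rec: Omit<FaceRecordLike, 'id'>): number {
    const id = this.nextId++; // emulate the autoIncrement keyPath
    this.records.set(id, { ...rec, id });
    return id;
  }

  load(): FaceRecordLike[] {
    return [...this.records.values()]; // equivalent of walking the cursor
  }

  remove(id: number): void {
    this.records.delete(id); // delete based on id, as in the real module
  }
}
```

Swapping a stand-in like this behind the same `load`/`save`/`remove` surface keeps the rest of the demo logic testable without a browser.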

File diff suppressed because one or more lines are too long

dist/human.esm.js vendored

@@ -70576,7 +70576,7 @@ registerBackend("wasm", async () => {
   const { wasm } = await init();
   return new BackendWasm(wasm);
 }, WASM_PRIORITY);
-var externalVersion = "3.11.0-20211108";
+var externalVersion = "3.11.0-20211110";
 var version8 = {
   tfjs: externalVersion,
   "tfjs-core": externalVersion,

File diff suppressed because one or more lines are too long

dist/human.js vendored

File diff suppressed because one or more lines are too long

dist/tfjs.esm.js vendored

@@ -69864,7 +69864,7 @@ registerBackend("wasm", async () => {
   const { wasm } = await init();
   return new BackendWasm(wasm);
 }, WASM_PRIORITY);
-var externalVersion = "3.11.0-20211108";
+var externalVersion = "3.11.0-20211110";
 var version8 = {
   tfjs: externalVersion,
   "tfjs-core": externalVersion,


@@ -32,13 +32,12 @@
   "human",
   "human-library",
   "face-detection",
+  "faceid",
   "face-geometry",
   "face-embedding",
   "face-recognition",
   "face-description",
   "face-matching",
-  "face-api",
-  "faceapi",
   "body-tracking",
   "body-segmentation",
   "hand-tracking",
@@ -49,7 +48,6 @@
   "gesture-recognition",
   "gaze-tracking",
   "age-gender",
-  "person",
   "tensorflowjs",
   "tfjs",
   "tensorflow"


@@ -4,7 +4,7 @@
 import type { env } from './util/env';
 export * from './config';
 export * from './result';
-export type { Tensor } from './tfjs/types';
+export type { Tensor, TensorLike } from './tfjs/types';
 export type { DrawOptions } from './util/draw';
 export type { Descriptor } from './face/match';
 export type { Box, Point } from './result';


@@ -4,7 +4,7 @@
  * TensorFlow Tensor type
  * @external
  */
-export { Tensor } from '@tensorflow/tfjs-core/dist/index';
+export { Tensor, TensorLike } from '@tensorflow/tfjs-core/dist/index';
 /**
  * TensorFlow GraphModel type


@@ -1,26 +1,26 @@
-2021-11-10 12:12:57 INFO: @vladmandic/human version 2.5.1
+2021-11-11 11:24:03 INFO: @vladmandic/human version 2.5.1
-2021-11-10 12:12:57 INFO: User: vlado Platform: linux Arch: x64 Node: v17.0.1
+2021-11-11 11:24:03 INFO: User: vlado Platform: linux Arch: x64 Node: v17.0.1
-2021-11-10 12:12:57 INFO: Application: {"name":"@vladmandic/human","version":"2.5.1"}
+2021-11-11 11:24:03 INFO: Application: {"name":"@vladmandic/human","version":"2.5.1"}
-2021-11-10 12:12:57 INFO: Environment: {"profile":"production","config":".build.json","package":"package.json","tsconfig":true,"eslintrc":true,"git":true}
+2021-11-11 11:24:03 INFO: Environment: {"profile":"production","config":".build.json","package":"package.json","tsconfig":true,"eslintrc":true,"git":true}
-2021-11-10 12:12:57 INFO: Toolchain: {"build":"0.6.3","esbuild":"0.13.13","typescript":"4.4.4","typedoc":"0.22.8","eslint":"8.2.0"}
+2021-11-11 11:24:03 INFO: Toolchain: {"build":"0.6.3","esbuild":"0.13.13","typescript":"4.4.4","typedoc":"0.22.8","eslint":"8.2.0"}
-2021-11-10 12:12:57 INFO: Build: {"profile":"production","steps":["clean","compile","typings","typedoc","lint","changelog"]}
+2021-11-11 11:24:03 INFO: Build: {"profile":"production","steps":["clean","compile","typings","typedoc","lint","changelog"]}
-2021-11-10 12:12:57 STATE: Clean: {"locations":["dist/*","types/*","typedoc/*"]}
+2021-11-11 11:24:03 STATE: Clean: {"locations":["dist/*","types/*","typedoc/*"]}
-2021-11-10 12:12:57 STATE: Compile: {"name":"tfjs/nodejs/cpu","format":"cjs","platform":"node","input":"tfjs/tf-node.ts","output":"dist/tfjs.esm.js","files":1,"inputBytes":102,"outputBytes":1275}
+2021-11-11 11:24:03 STATE: Compile: {"name":"tfjs/nodejs/cpu","format":"cjs","platform":"node","input":"tfjs/tf-node.ts","output":"dist/tfjs.esm.js","files":1,"inputBytes":102,"outputBytes":1275}
-2021-11-10 12:12:57 STATE: Compile: {"name":"human/nodejs/cpu","format":"cjs","platform":"node","input":"src/human.ts","output":"dist/human.node.js","files":57,"inputBytes":527528,"outputBytes":445576}
+2021-11-11 11:24:03 STATE: Compile: {"name":"human/nodejs/cpu","format":"cjs","platform":"node","input":"src/human.ts","output":"dist/human.node.js","files":57,"inputBytes":527509,"outputBytes":445576}
-2021-11-10 12:12:57 STATE: Compile: {"name":"tfjs/nodejs/gpu","format":"cjs","platform":"node","input":"tfjs/tf-node-gpu.ts","output":"dist/tfjs.esm.js","files":1,"inputBytes":110,"outputBytes":1283}
+2021-11-11 11:24:03 STATE: Compile: {"name":"tfjs/nodejs/gpu","format":"cjs","platform":"node","input":"tfjs/tf-node-gpu.ts","output":"dist/tfjs.esm.js","files":1,"inputBytes":110,"outputBytes":1283}
-2021-11-10 12:12:57 STATE: Compile: {"name":"human/nodejs/gpu","format":"cjs","platform":"node","input":"src/human.ts","output":"dist/human.node-gpu.js","files":57,"inputBytes":527536,"outputBytes":445580}
+2021-11-11 11:24:03 STATE: Compile: {"name":"human/nodejs/gpu","format":"cjs","platform":"node","input":"src/human.ts","output":"dist/human.node-gpu.js","files":57,"inputBytes":527517,"outputBytes":445580}
-2021-11-10 12:12:57 STATE: Compile: {"name":"tfjs/nodejs/wasm","format":"cjs","platform":"node","input":"tfjs/tf-node-wasm.ts","output":"dist/tfjs.esm.js","files":1,"inputBytes":149,"outputBytes":1350}
+2021-11-11 11:24:03 STATE: Compile: {"name":"tfjs/nodejs/wasm","format":"cjs","platform":"node","input":"tfjs/tf-node-wasm.ts","output":"dist/tfjs.esm.js","files":1,"inputBytes":149,"outputBytes":1350}
-2021-11-10 12:12:57 STATE: Compile: {"name":"human/nodejs/wasm","format":"cjs","platform":"node","input":"src/human.ts","output":"dist/human.node-wasm.js","files":57,"inputBytes":527603,"outputBytes":445652}
+2021-11-11 11:24:04 STATE: Compile: {"name":"human/nodejs/wasm","format":"cjs","platform":"node","input":"src/human.ts","output":"dist/human.node-wasm.js","files":57,"inputBytes":527584,"outputBytes":445652}
-2021-11-10 12:12:57 STATE: Compile: {"name":"tfjs/browser/version","format":"esm","platform":"browser","input":"tfjs/tf-version.ts","output":"dist/tfjs.version.js","files":1,"inputBytes":1063,"outputBytes":1652}
+2021-11-11 11:24:04 STATE: Compile: {"name":"tfjs/browser/version","format":"esm","platform":"browser","input":"tfjs/tf-version.ts","output":"dist/tfjs.version.js","files":1,"inputBytes":1063,"outputBytes":1652}
-2021-11-10 12:12:57 STATE: Compile: {"name":"tfjs/browser/esm/nobundle","format":"esm","platform":"browser","input":"tfjs/tf-browser.ts","output":"dist/tfjs.esm.js","files":2,"inputBytes":2326,"outputBytes":912}
+2021-11-11 11:24:04 STATE: Compile: {"name":"tfjs/browser/esm/nobundle","format":"esm","platform":"browser","input":"tfjs/tf-browser.ts","output":"dist/tfjs.esm.js","files":2,"inputBytes":2326,"outputBytes":912}
-2021-11-10 12:12:57 STATE: Compile: {"name":"human/browser/esm/nobundle","format":"esm","platform":"browser","input":"src/human.ts","output":"dist/human.esm-nobundle.js","files":57,"inputBytes":527165,"outputBytes":447674}
+2021-11-11 11:24:04 STATE: Compile: {"name":"human/browser/esm/nobundle","format":"esm","platform":"browser","input":"src/human.ts","output":"dist/human.esm-nobundle.js","files":57,"inputBytes":527146,"outputBytes":447674}
-2021-11-10 12:12:57 STATE: Compile: {"name":"tfjs/browser/esm/custom","format":"esm","platform":"browser","input":"tfjs/tf-custom.ts","output":"dist/tfjs.esm.js","files":2,"inputBytes":2562703,"outputBytes":2497652}
+2021-11-11 11:24:04 STATE: Compile: {"name":"tfjs/browser/esm/custom","format":"esm","platform":"browser","input":"tfjs/tf-custom.ts","output":"dist/tfjs.esm.js","files":2,"inputBytes":2562703,"outputBytes":2497652}
-2021-11-10 12:12:58 STATE: Compile: {"name":"human/browser/iife/bundle","format":"iife","platform":"browser","input":"src/human.ts","output":"dist/human.js","files":57,"inputBytes":3023905,"outputBytes":1614842}
+2021-11-11 11:24:04 STATE: Compile: {"name":"human/browser/iife/bundle","format":"iife","platform":"browser","input":"src/human.ts","output":"dist/human.js","files":57,"inputBytes":3023886,"outputBytes":1614842}
-2021-11-10 12:12:58 STATE: Compile: {"name":"human/browser/esm/bundle","format":"esm","platform":"browser","input":"src/human.ts","output":"dist/human.esm.js","files":57,"inputBytes":3023905,"outputBytes":2950895}
+2021-11-11 11:24:05 STATE: Compile: {"name":"human/browser/esm/bundle","format":"esm","platform":"browser","input":"src/human.ts","output":"dist/human.esm.js","files":57,"inputBytes":3023886,"outputBytes":2950895}
-2021-11-10 12:13:20 STATE: Typings: {"input":"src/human.ts","output":"types","files":50}
+2021-11-11 11:24:30 STATE: Typings: {"input":"src/human.ts","output":"types","files":50}
-2021-11-10 12:13:27 STATE: TypeDoc: {"input":"src/human.ts","output":"typedoc","objects":49,"generated":true}
+2021-11-11 11:24:38 STATE: TypeDoc: {"input":"src/human.ts","output":"typedoc","objects":50,"generated":true}
-2021-11-10 12:13:27 STATE: Compile: {"name":"demo/typescript","format":"esm","platform":"browser","input":"demo/typescript/index.ts","output":"demo/typescript/index.js","files":1,"inputBytes":5801,"outputBytes":3822}
+2021-11-11 11:24:38 STATE: Compile: {"name":"demo/typescript","format":"esm","platform":"browser","input":"demo/typescript/index.ts","output":"demo/typescript/index.js","files":1,"inputBytes":5801,"outputBytes":3822}
-2021-11-10 12:13:27 STATE: Compile: {"name":"demo/facerecognition","format":"esm","platform":"browser","input":"demo/facerecognition/index.ts","output":"demo/facerecognition/index.js","files":1,"inputBytes":8949,"outputBytes":6529}
+2021-11-11 11:24:38 STATE: Compile: {"name":"demo/faceid","format":"esm","platform":"browser","input":"demo/faceid/index.ts","output":"demo/faceid/index.js","files":2,"inputBytes":13534,"outputBytes":10091}
-2021-11-10 12:14:05 STATE: Lint: {"locations":["*.json","src/**/*.ts","test/**/*.js","demo/**/*.js"],"files":90,"errors":0,"warnings":0}
+2021-11-11 11:25:16 STATE: Lint: {"locations":["*.json","src/**/*.ts","test/**/*.js","demo/**/*.js"],"files":90,"errors":0,"warnings":0}
-2021-11-10 12:14:06 STATE: ChangeLog: {"repository":"https://github.com/vladmandic/human","branch":"main","output":"CHANGELOG.md"}
+2021-11-11 11:25:17 STATE: ChangeLog: {"repository":"https://github.com/vladmandic/human","branch":"main","output":"CHANGELOG.md"}
-2021-11-10 12:14:06 INFO: Done...
+2021-11-11 11:25:17 INFO: Done...

File diff suppressed because it is too large

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long


@@ -2,7 +2,7 @@ import type { Tensor } from './tfjs/types';
 import type { env } from './util/env';
 export * from './config';
 export * from './result';
-export type { Tensor } from './tfjs/types';
+export type { Tensor, TensorLike } from './tfjs/types';
 export type { DrawOptions } from './util/draw';
 export type { Descriptor } from './face/match';
 export type { Box, Point } from './result';


@@ -3,7 +3,7 @@
  * TensorFlow Tensor type
  * @external
  */
-export { Tensor } from '@tensorflow/tfjs-core/dist/index';
+export { Tensor, TensorLike } from '@tensorflow/tfjs-core/dist/index';
 /**
  * TensorFlow GraphModel type
  * @external

wiki

@@ -1 +1 @@
-Subproject commit 2a937c42e7539b7aa077a9f41085ca573bba7578
+Subproject commit e26b155506e7981fa8187be228b5651de77ee8c6