refactor draw and models namespaces

pull/356/head
Vladimir Mandic 2022-11-17 14:39:02 -05:00
parent 8fe34fd723
commit 8d9190a773
31 changed files with 1510 additions and 1462 deletions


@@ -9,7 +9,10 @@
## Changelog
-### **HEAD -> main** 2022/11/16 mandic00@live.com
+### **HEAD -> main** 2022/11/17 mandic00@live.com
+### **origin/main** 2022/11/16 mandic00@live.com
- added webcam id specification
- include external typedefs


@@ -68,7 +68,7 @@
- **Full** [[*Live*]](https://vladmandic.github.io/human/demo/index.html) [[*Details*]](https://github.com/vladmandic/human/tree/main/demo): Main browser demo app that showcases all Human capabilities
- **Simple** [[*Live*]](https://vladmandic.github.io/human/demo/typescript/index.html) [[*Details*]](https://github.com/vladmandic/human/tree/main/demo/typescript): Simple demo in WebCam processing demo in TypeScript
- **Embedded** [[*Live*]](https://vladmandic.github.io/human/demo/video/index.html) [[*Details*]](https://github.com/vladmandic/human/tree/main/video/index.html): Even simpler demo with tiny code embedded in HTML file
-- **Face Match** [[*Live*]](https://vladmandic.github.io/human/demo/facematch/index.html) [[*Details*]](https://github.com/vladmandic/human/tree/main/demo/facematch): Extract faces from images, calculates face descriptors and simmilarities and matches them to known database
+- **Face Match** [[*Live*]](https://vladmandic.github.io/human/demo/facematch/index.html) [[*Details*]](https://github.com/vladmandic/human/tree/main/demo/facematch): Extract faces from images, calculates face descriptors and similarities and matches them to known database
- **Face ID** [[*Live*]](https://vladmandic.github.io/human/demo/faceid/index.html) [[*Details*]](https://github.com/vladmandic/human/tree/main/demo/faceid): Runs multiple checks to validate webcam input before performing face match to faces in IndexDB
- **Multi-thread** [[*Live*]](https://vladmandic.github.io/human/demo/multithread/index.html) [[*Details*]](https://github.com/vladmandic/human/tree/main/demo/multithread): Runs each Human module in a separate web worker for highest possible performance
- **NextJS** [[*Live*]](https://vladmandic.github.io/human-next/out/index.html) [[*Details*]](https://github.com/vladmandic/human-next): Use Human with TypeScript, NextJS and ReactJS
@@ -377,6 +377,16 @@ drawResults(); // start draw loop
And for even better results, you can run detection in a separate web worker thread
+<br><hr><br>
+## Detailed Usage
+- [**Wiki Home**](https://github.com/vladmandic/human/wiki)
+- [**List of all available methods, properies and namespaces**](https://github.com/vladmandic/human/wiki/Usage)
+- [**TypeDoc API Specification - Main class**](https://vladmandic.github.io/human/typedoc/classes/Human.html)
+- [**TypeDoc API Specification - Full**](https://vladmandic.github.io/human/typedoc/)
<br><hr><br>
## TypeDefs

TODO.md

@@ -80,6 +80,15 @@ Architecture:
- Upgrade to **TFJS 4.0** with **strong typing**
  see [notes](https://github.com/vladmandic/human#typedefs) on how to use
- `TypeDef` refactoring
+- Re-architect `human.models` namespace for better dynamic model handling
+  Added additional methods `load`, `list`, `loaded`, `reset`
- Add named export for improved bundler support when using non-default imports
-- Support for `NodeJS` v19
+- Support for **NodeJS v19**
- Upgrade to **TypeScript 4.9**
+Breaking changes:
+- Replaced `result.face[n].iris` with `result.face[n].distance`
+- Replaced `human.getModelStats()` with `human.models.stats()`
+- Moved `human.similarity`, `human.distance` and `human.match` to namespace `human.match.*`
+- Obsolete `human.enhance()`
+- Obsolete `human.gl`
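The breaking changes listed above move the face-matching helpers into a single namespace. As a rough standalone sketch of that surface (not the library's actual implementation — the Minkowski-style distance and the 0..1 normalization below are assumptions for illustration):

```javascript
// Hypothetical standalone model of the relocated human.match.* namespace:
// similarity, distance, and find now live under one object.
const match = {
  // raw difference between two descriptors (order 2 = sum of squared diffs)
  distance(a, b, { order = 2 } = {}) {
    let sum = 0;
    for (let i = 0; i < Math.min(a.length, b.length); i++) sum += Math.abs(a[i] - b[i]) ** order;
    return sum;
  },
  // map raw distance to a 0..1 score where 1 means identical descriptors
  similarity(a, b, options = {}) {
    const d = this.distance(a, b, options);
    return Math.max(0, 1 - Math.sqrt(d) / Math.sqrt(a.length || 1));
  },
  // best match of one descriptor against an array of known descriptors
  find(descriptor, descriptors, options = {}) {
    let best = { index: -1, similarity: 0 };
    descriptors.forEach((desc, index) => {
      const s = this.similarity(descriptor, desc, options);
      if (s > best.similarity) best = { index, similarity: s };
    });
    return best;
  },
};
```

Migration then amounts to a rename: `human.similarity(a, b)` becomes `human.match.similarity(a, b)`, `human.distance(a, b)` becomes `human.match.distance(a, b)`, and the old `human.match(a, list)` becomes `human.match.find(a, list)`.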

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long


@@ -250,7 +250,7 @@ async function detectFace() {
}
const db = await indexDb.load();
const descriptors = db.map((rec) => rec.descriptor).filter((desc) => desc.length > 0);
-const res = await human.match(current.face.embedding, descriptors, matchOptions);
+const res = await human.match.find(current.face.embedding, descriptors, matchOptions);
current.record = db[res.index] || null;
if (current.record) {
log(`best match: ${current.record.name} | id: ${current.record.id} | similarity: ${Math.round(1000 * res.similarity) / 10}%`);


@@ -11,7 +11,7 @@
## Browser Face Recognition Demo
- `demo/facematch`: Demo for Browsers that uses all face description and embedding features to
-detect, extract and identify all faces plus calculate simmilarity between them
+detect, extract and identify all faces plus calculate similarity between them
It highlights functionality such as:


@@ -1,7 +1,7 @@
/**
* Human demo for browsers
*
-* Demo for face descriptor analysis and face simmilarity analysis
+* Demo for face descriptor analysis and face similarity analysis
*/
/** @type {Human} */
@@ -70,6 +70,9 @@ async function SelectFaceCanvas(face) {
document.getElementById('orig').style.filter = 'blur(16px)';
if (face.tensor) {
title('Sorting Faces by Similarity');
+const c = document.getElementById('orig');
+await human.tf.browser.toPixels(face.tensor, c);
+/*
const enhanced = human.enhance(face);
if (enhanced) {
const c = document.getElementById('orig');
@@ -81,8 +84,9 @@ async function SelectFaceCanvas(face) {
ctx.font = 'small-caps 0.4rem "Lato"';
ctx.fillStyle = 'rgba(255, 255, 255, 1)';
}
+*/
const arr = db.map((rec) => rec.embedding);
-const res = await human.match(face.embedding, arr);
+const res = await human.match.find(face.embedding, arr);
log('Match:', db[res.index].name);
const emotion = face.emotion[0] ? `${Math.round(100 * face.emotion[0].score)}% ${face.emotion[0].emotion}` : 'N/A';
document.getElementById('desc').innerHTML = `
@@ -103,7 +107,7 @@ async function SelectFaceCanvas(face) {
for (const canvas of canvases) {
// calculate similarity from selected face to current one in the loop
const current = all[canvas.tag.sample][canvas.tag.face];
-const similarity = human.similarity(face.embedding, current.embedding);
+const similarity = human.match.similarity(face.embedding, current.embedding);
canvas.tag.similarity = similarity;
// get best match
// draw the canvas
@@ -120,7 +124,7 @@ async function SelectFaceCanvas(face) {
ctx.font = 'small-caps 1rem "Lato"';
const start = human.now();
const arr = db.map((rec) => rec.embedding);
-const res = await human.match(current.embedding, arr);
+const res = await human.match.find(current.embedding, arr);
time += (human.now() - start);
if (res.similarity > minScore) ctx.fillText(`DB: ${(100 * res.similarity).toFixed(1)}% ${db[res.index].name}`, 4, canvas.height - 30);
}
@@ -161,7 +165,7 @@ async function AddFaceCanvas(index, res, fileName) {
ctx.fillStyle = 'rgba(255, 255, 255, 1)';
ctx.fillText(`${res.face[i].age}y ${(100 * (res.face[i].genderScore || 0)).toFixed(1)}% ${res.face[i].gender}`, 4, canvas.height - 6);
const arr = db.map((rec) => rec.embedding);
-const result = human.match(res.face[i].embedding, arr);
+const result = human.match.find(res.face[i].embedding, arr);
ctx.font = 'small-caps 1rem "Lato"';
if (result.similarity && res.similarity > minScore) ctx.fillText(`${(100 * result.similarity).toFixed(1)}% ${db[result.index].name}`, 4, canvas.height - 30);
document.getElementById('faces').appendChild(canvas);
@@ -256,7 +260,7 @@ async function main() {
title('');
log('Ready');
human.validate(userConfig);
-human.similarity([], []);
+human.match.similarity([], []);
}
window.onload = main;


@@ -222,21 +222,13 @@ async function calcSimmilarity(result) {
compare.original = result;
log('setting face compare baseline:', result.face[0]);
if (result.face[0].tensor) {
-const enhanced = human.enhance(result.face[0]);
-if (enhanced) {
-const c = document.getElementById('orig');
-const squeeze = human.tf.squeeze(enhanced);
-const norm = human.tf.div(squeeze, 255);
-human.tf.browser.toPixels(norm, c);
-human.tf.dispose(enhanced);
-human.tf.dispose(squeeze);
-human.tf.dispose(norm);
-}
+const c = document.getElementById('orig');
+human.tf.browser.toPixels(result.face[0].tensor, c);
} else {
document.getElementById('compare-canvas').getContext('2d').drawImage(compare.original.canvas, 0, 0, 200, 200);
}
}
-const similarity = human.similarity(compare.original.face[0].embedding, result.face[0].embedding);
+const similarity = human.match.similarity(compare.original.face[0].embedding, result.face[0].embedding);
document.getElementById('similarity').innerText = `similarity: ${Math.trunc(1000 * similarity) / 10}%`;
}


@@ -82,7 +82,7 @@ node demo/nodejs/node.js
detector: { modelPath: 'handdetect.json' },
skeleton: { modelPath: 'handskeleton.json' }
},
-object: { enabled: true, modelPath: 'mb3-centernet.json', minConfidence: 0.2, iouThreshold: 0.4, maxDetected: 10, skipFrames: 19 }
+object: { enabled: true, modelPath: 'centernet.json', minConfidence: 0.2, iouThreshold: 0.4, maxDetected: 10, skipFrames: 19 }
}
08:52:15.673 Human: version: 2.0.0
08:52:15.674 Human: tfjs version: 3.6.0
@@ -96,7 +96,7 @@ node demo/nodejs/node.js
08:52:15.847 Human: load model: file://models/handdetect.json
08:52:15.847 Human: load model: file://models/handskeleton.json
08:52:15.914 Human: load model: file://models/movenet-lightning.json
-08:52:15.957 Human: load model: file://models/mb3-centernet.json
+08:52:15.957 Human: load model: file://models/centernet.json
08:52:16.015 Human: load model: file://models/faceres.json
08:52:16.015 Human: tf engine state: 50796152 bytes 1318 tensors
2021-06-01 08:52:16 INFO: Loaded: [ 'face', 'movenet', 'handpose', 'emotion', 'centernet', 'faceres', [length]: 6 ]


@@ -57,7 +57,7 @@ async function main() {
if (!res1 || !res1.face || res1.face.length === 0 || !res2 || !res2.face || res2.face.length === 0) {
throw new Error('Could not detect face descriptors');
}
-const similarity = human.similarity(res1.face[0].embedding, res2.face[0].embedding, { order: 2 });
+const similarity = human.match.similarity(res1.face[0].embedding, res2.face[0].embedding, { order: 2 });
log.data('Similarity: ', similarity);
}
} }


@@ -52,7 +52,7 @@ async function main() {
log('platform:', human.env.platform, '| agent:', human.env.agent);
await human.load(); // preload all models
log('backend:', human.tf.getBackend(), '| available:', human.env.backends);
-log('models stats:', human.getModelStats());
+log('models stats:', human.models.stats());
log('models loaded:', Object.values(human.models).filter((model) => model !== null).length);
await human.warmup(); // warmup function to initialize backend for future faster detection
const numTensors = human.tf.engine().state.numTensors;


@@ -4,100 +4,6 @@
author: <https://github.com/vladmandic>'
*/
import*as m from"../../dist/human.esm.js";var f=1920,b={modelBasePath:"../../models",filter:{enabled:!0,equalization:!1,flip:!1,width:f},face:{enabled:!0,detector:{rotation:!0},mesh:{enabled:!0},attention:{enabled:!1},iris:{enabled:!0},description:{enabled:!0},emotion:{enabled:!0},antispoof:{enabled:!0},liveness:{enabled:!0}},body:{enabled:!0},hand:{enabled:!1},object:{enabled:!1},segmentation:{enabled:!1},gesture:{enabled:!0}},e=new m.Human(b);e.env.perfadd=!1;e.draw.options.font='small-caps 18px "Lato"';e.draw.options.lineHeight=20;var a={video:document.getElementById("video"),canvas:document.getElementById("canvas"),log:document.getElementById("log"),fps:document.getElementById("status"),perf:document.getElementById("performance")},n={detect:0,draw:0,tensors:0,start:0},s={detectFPS:0,drawFPS:0,frames:0,averageMs:0},o=(...t)=>{a.log.innerText+=t.join(" ")+`
`,console.log(...t)},r=t=>a.fps.innerText=t,g=t=>a.perf.innerText="tensors:"+e.tf.memory().numTensors.toString()+" | performance: "+JSON.stringify(t).replace(/"|{|}/g,"").replace(/,/g," | ");async function u(){if(!a.video.paused){n.start===0&&(n.start=e.now()),await e.detect(a.video);let t=e.tf.memory().numTensors;t-n.tensors!==0&&o("allocated tensors:",t-n.tensors),n.tensors=t,s.detectFPS=Math.round(1e3*1e3/(e.now()-n.detect))/1e3,s.frames++,s.averageMs=Math.round(1e3*(e.now()-n.start)/s.frames)/1e3,s.frames%100===0&&!a.video.paused&&o("performance",{...s,tensors:n.tensors})}n.detect=e.now(),requestAnimationFrame(u)}async function p(){var d,i,c;if(!a.video.paused){let l=e.next(e.result),w=await e.image(a.video);e.draw.canvas(w.canvas,a.canvas);let v={bodyLabels:`person confidence [score] and ${(c=(i=(d=e.result)==null?void 0:d.body)==null?void 0:i[0])==null?void 0:c.keypoints.length} keypoints`};await e.draw.all(a.canvas,l,v),g(l.performance)}let t=e.now();s.drawFPS=Math.round(1e3*1e3/(t-n.draw))/1e3,n.draw=t,r(a.video.paused?"paused":`fps: ${s.detectFPS.toFixed(1).padStart(5," ")} detect | ${s.drawFPS.toFixed(1).padStart(5," ")} draw`),setTimeout(p,30)}async function h(){let d=(await e.webcam.enumerate())[0].deviceId;await e.webcam.start({element:a.video,crop:!0,width:f,id:d}),a.canvas.width=e.webcam.width,a.canvas.height=e.webcam.height,a.canvas.onclick=async()=>{e.webcam.paused?await e.webcam.play():e.webcam.pause()}}async function y(){o("human version:",e.version,"| tfjs version:",e.tf.version["tfjs-core"]),o("platform:",e.env.platform,"| agent:",e.env.agent),r("loading..."),await e.load(),o("backend:",e.tf.getBackend(),"| available:",e.env.backends),o("models stats:",e.models.stats()),o("models loaded:",Object.values(e.models).filter(t=>t!==null).length),o("environment",e.env),r("initializing..."),await e.warmup(),await h(),await u(),await p()}window.onload=y;
// demo/typescript/index.ts
import * as H from "../../dist/human.esm.js";
var width = 1920;
var humanConfig = {
modelBasePath: "../../models",
filter: { enabled: true, equalization: false, flip: false, width },
face: { enabled: true, detector: { rotation: true }, mesh: { enabled: true }, attention: { enabled: false }, iris: { enabled: true }, description: { enabled: true }, emotion: { enabled: true }, antispoof: { enabled: true }, liveness: { enabled: true } },
body: { enabled: true },
hand: { enabled: false },
object: { enabled: false },
segmentation: { enabled: false },
gesture: { enabled: true }
};
var human = new H.Human(humanConfig);
human.env.perfadd = false;
human.draw.options.font = 'small-caps 18px "Lato"';
human.draw.options.lineHeight = 20;
var dom = {
video: document.getElementById("video"),
canvas: document.getElementById("canvas"),
log: document.getElementById("log"),
fps: document.getElementById("status"),
perf: document.getElementById("performance")
};
var timestamp = { detect: 0, draw: 0, tensors: 0, start: 0 };
var fps = { detectFPS: 0, drawFPS: 0, frames: 0, averageMs: 0 };
var log = (...msg) => {
dom.log.innerText += msg.join(" ") + "\n";
console.log(...msg);
};
var status = (msg) => dom.fps.innerText = msg;
var perf = (msg) => dom.perf.innerText = "tensors:" + human.tf.memory().numTensors.toString() + " | performance: " + JSON.stringify(msg).replace(/"|{|}/g, "").replace(/,/g, " | ");
async function detectionLoop() {
if (!dom.video.paused) {
if (timestamp.start === 0)
timestamp.start = human.now();
await human.detect(dom.video);
const tensors = human.tf.memory().numTensors;
if (tensors - timestamp.tensors !== 0)
log("allocated tensors:", tensors - timestamp.tensors);
timestamp.tensors = tensors;
fps.detectFPS = Math.round(1e3 * 1e3 / (human.now() - timestamp.detect)) / 1e3;
fps.frames++;
fps.averageMs = Math.round(1e3 * (human.now() - timestamp.start) / fps.frames) / 1e3;
if (fps.frames % 100 === 0 && !dom.video.paused)
log("performance", { ...fps, tensors: timestamp.tensors });
}
timestamp.detect = human.now();
requestAnimationFrame(detectionLoop);
}
async function drawLoop() {
var _a, _b, _c;
if (!dom.video.paused) {
const interpolated = human.next(human.result);
const processed = await human.image(dom.video);
human.draw.canvas(processed.canvas, dom.canvas);
const opt = { bodyLabels: `person confidence [score] and ${(_c = (_b = (_a = human.result) == null ? void 0 : _a.body) == null ? void 0 : _b[0]) == null ? void 0 : _c.keypoints.length} keypoints` };
await human.draw.all(dom.canvas, interpolated, opt);
perf(interpolated.performance);
}
const now = human.now();
fps.drawFPS = Math.round(1e3 * 1e3 / (now - timestamp.draw)) / 1e3;
timestamp.draw = now;
status(dom.video.paused ? "paused" : `fps: ${fps.detectFPS.toFixed(1).padStart(5, " ")} detect | ${fps.drawFPS.toFixed(1).padStart(5, " ")} draw`);
setTimeout(drawLoop, 30);
}
async function webCam() {
const devices = await human.webcam.enumerate();
const id = devices[0].deviceId;
await human.webcam.start({ element: dom.video, crop: true, width, id });
dom.canvas.width = human.webcam.width;
dom.canvas.height = human.webcam.height;
dom.canvas.onclick = async () => {
if (human.webcam.paused)
await human.webcam.play();
else
human.webcam.pause();
};
}
async function main() {
log("human version:", human.version, "| tfjs version:", human.tf.version["tfjs-core"]);
log("platform:", human.env.platform, "| agent:", human.env.agent);
status("loading...");
await human.load();
log("backend:", human.tf.getBackend(), "| available:", human.env.backends);
log("models stats:", human.getModelStats());
log("models loaded:", Object.values(human.models).filter((model) => model !== null).length);
log("environment", human.env);
status("initializing...");
await human.warmup();
await webCam();
await detectionLoop();
await drawLoop();
}
window.onload = main;
//# sourceMappingURL=index.js.map

File diff suppressed because one or more lines are too long


@@ -100,7 +100,7 @@ async function main() { // main entry point
status('loading...');
await human.load(); // preload all models
log('backend:', human.tf.getBackend(), '| available:', human.env.backends);
-log('models stats:', human.getModelStats());
+log('models stats:', human.models.stats());
log('models loaded:', Object.values(human.models).filter((model) => model !== null).length);
log('environment', human.env);
status('initializing...');

models/centernet.bin Normal file

Binary file not shown.

models/centernet.json Normal file

File diff suppressed because one or more lines are too long


@@ -1,6 +1,7 @@
{
"antispoof": 853098,
"blazeface": 538928,
+"centernet": 4030290,
"emotion": 820516,
"facemesh": 1477958,
"faceres": 6978814,
@@ -8,7 +9,6 @@
"handtrack": 2964837,
"iris": 2599092,
"liveness": 592976,
-"mb3-centernet": 4030290,
"models": 0,
"movenet-lightning": 4650216,
"age": 161240,


@@ -449,7 +449,7 @@ const config: Config = {
},
object: {
enabled: false,
-modelPath: 'mb3-centernet.json',
+modelPath: 'centernet.json',
minConfidence: 0.2,
iouThreshold: 0.4,
maxDetected: 10,


@@ -2,7 +2,7 @@
* FaceRes model implementation
*
* Returns Age, Gender, Descriptor
-* Implements Face simmilarity function
+* Implements Face similarity function
*
* Based on: [**HSE-FaceRes**](https://github.com/HSE-asavchenko/HSE_FaceRec_tf)
*/


@@ -22,11 +22,9 @@ import * as centernet from './object/centernet';
import * as efficientpose from './body/efficientpose';
import * as face from './face/face';
import * as facemesh from './face/facemesh';
-import * as faceres from './face/faceres';
import * as gesture from './gesture/gesture';
import * as handpose from './hand/handpose';
import * as handtrack from './hand/handtrack';
-import * as humangl from './tfjs/humangl';
import * as image from './image/image';
import * as interpolate from './util/interpolate';
import * as meet from './segmentation/meet';
@@ -41,7 +39,7 @@ import * as selfie from './segmentation/selfie';
import * as warmups from './warmup';
// type definitions
-import { Input, DrawOptions, Config, Result, FaceResult, HandResult, BodyResult, ObjectResult, GestureResult, PersonResult, AnyCanvas, emptyResult } from './exports';
+import { Input, DrawOptions, Config, Result, FaceResult, HandResult, BodyResult, ObjectResult, GestureResult, AnyCanvas, emptyResult } from './exports';
import type { Tensor, Tensor4D } from './tfjs/types';
// type exports
export * from './exports';
@@ -94,7 +92,15 @@ export class Human {
* - options: are global settings for all draw operations, can be overriden for each draw method {@link DrawOptions}
* - face, body, hand, gesture, object, person: draws detected results as overlays on canvas
*/
-draw: { canvas: typeof draw.canvas, face: typeof draw.face, body: typeof draw.body, hand: typeof draw.hand, gesture: typeof draw.gesture, object: typeof draw.object, person: typeof draw.person, all: typeof draw.all, options: DrawOptions };
+// draw: { canvas: typeof draw.canvas, face: typeof draw.face, body: typeof draw.body, hand: typeof draw.hand, gesture: typeof draw.gesture, object: typeof draw.object, person: typeof draw.person, all: typeof draw.all, options: DrawOptions };
+draw: typeof draw = draw;
+/** Face Matching
+* - similarity: compare two face descriptors and return similarity index
+* - distance: compare two face descriptors and return raw calculated differences
+* - find: compare face descriptor to array of face descriptors and return best match
+*/
+match: typeof match = match;
/** Currently loaded models
* @internal
@@ -121,8 +127,6 @@
#numTensors: number;
#analyzeMemoryLeaks: boolean;
#checkSanity: boolean;
-/** WebGL debug info */
-gl: Record<string, unknown>;
// definition end
/** Constructor for **Human** library that is futher used for all operations
@@ -153,28 +157,15 @@
this.performance = {};
this.events = (typeof EventTarget !== 'undefined') ? new EventTarget() : undefined;
// object that contains all initialized models
-this.models = new models.Models();
+this.models = new models.Models(this);
// reexport draw methods
draw.init();
-this.draw = {
-options: draw.options,
-canvas: (input: AnyCanvas | HTMLImageElement | HTMLVideoElement, output: AnyCanvas) => draw.canvas(input, output),
-face: (output: AnyCanvas, result: FaceResult[], options?: Partial<DrawOptions>) => draw.face(output, result, options),
-body: (output: AnyCanvas, result: BodyResult[], options?: Partial<DrawOptions>) => draw.body(output, result, options),
-hand: (output: AnyCanvas, result: HandResult[], options?: Partial<DrawOptions>) => draw.hand(output, result, options),
-gesture: (output: AnyCanvas, result: GestureResult[], options?: Partial<DrawOptions>) => draw.gesture(output, result, options),
-object: (output: AnyCanvas, result: ObjectResult[], options?: Partial<DrawOptions>) => draw.object(output, result, options),
-person: (output: AnyCanvas, result: PersonResult[], options?: Partial<DrawOptions>) => draw.person(output, result, options),
-all: (output: AnyCanvas, result: Result, options?: Partial<DrawOptions>) => draw.all(output, result, options),
-};
this.result = emptyResult();
// export access to image processing
this.process = { tensor: null, canvas: null };
// export raw access to underlying models
this.faceTriangulation = facemesh.triangulation;
this.faceUVMap = facemesh.uvmap;
-// set gl info
-this.gl = humangl.config;
// init model validation
models.validateModel(this, null, '');
// include platform info
@@ -227,18 +218,6 @@
return msgs;
}
-/** Check model for invalid kernel ops for current backend */
-check() {
-return models.validate(this);
-}
-/** Exports face matching methods {@link match#similarity} */
-public similarity = match.similarity;
-/** Exports face matching methods {@link match#distance} */
-public distance = match.distance;
-/** Exports face matching methods {@link match#match} */
-public match = match.match;
/** Utility wrapper for performance.now() */
now(): number { // eslint-disable-line class-methods-use-this
return now();
@@ -273,16 +252,7 @@
return tensor;
}
-/** Enhance method performs additional enhacements to face image previously detected for futher processing
-*
-* @param input - Tensor as provided in human.result.face[n].tensor
-* @returns Tensor
-*/
-enhance(input: Tensor): Tensor | null { // eslint-disable-line class-methods-use-this
-return faceres.enhance(input);
-}
-/** Compare two input tensors for pixel simmilarity
+/** Compare two input tensors for pixel similarity
* - use `human.image` to process any valid input and get a tensor that can be used for compare
* - when passing manually generated tensors:
* - both input tensors must be in format [1, height, width, 3]
@@ -325,18 +295,17 @@ export class Human {
await tf.ready();
if (this.env.browser) {
if (this.config.debug) log('configuration:', this.config);
// @ts-ignore private property
if (this.config.debug) log('tf flags:', this.tf.ENV.flags);
}
}
await models.load(this); // actually loads models
await this.models.load(); // actually loads models
if (this.env.initial && this.config.debug) log('tf engine state:', this.tf.engine().state.numBytes, 'bytes', this.tf.engine().state.numTensors, 'tensors'); // print memory stats on first run
this.env.initial = false;
const loaded = Object.values(this.models).filter((model) => model).length;
if (loaded !== count) { // number of loaded models changed
models.validate(this); // validate kernel ops used by model against current backend
this.models.validate(); // validate kernel ops used by model against current backend
this.emit('load');
}
@@ -359,9 +328,6 @@ export class Human {
return interpolate.calc(result, this.config);
}
/** get model loading/loaded stats */
getModelStats(): models.ModelStats { return models.getModelStats(this); }
/** Warmup method pre-initializes all configured models for faster inference
* - can take significant time on startup
* - only used for `webgl` and `humangl` backends

View File

@@ -31,136 +31,10 @@ import { modelStats, ModelInfo } from './tfjs/load';
import type { GraphModel } from './tfjs/types';
import type { Human } from './human';
/** Instances of all possible TFJS Graph Models used by Human
* - loaded as needed based on configuration
* - initialized explicitly with `human.load()` method
* - initialized implicitly on first call to `human.detect()`
* - each model can be `null` if not loaded, instance of `GraphModel` if loaded or `Promise` if loading
*/
export class Models {
ssrnetage: null | GraphModel | Promise<GraphModel> = null;
gear: null | GraphModel | Promise<GraphModel> = null;
blazeposedetect: null | GraphModel | Promise<GraphModel> = null;
blazepose: null | GraphModel | Promise<GraphModel> = null;
centernet: null | GraphModel | Promise<GraphModel> = null;
efficientpose: null | GraphModel | Promise<GraphModel> = null;
mobilefacenet: null | GraphModel | Promise<GraphModel> = null;
insightface: null | GraphModel | Promise<GraphModel> = null;
emotion: null | GraphModel | Promise<GraphModel> = null;
facedetect: null | GraphModel | Promise<GraphModel> = null;
faceiris: null | GraphModel | Promise<GraphModel> = null;
facemesh: null | GraphModel | Promise<GraphModel> = null;
faceres: null | GraphModel | Promise<GraphModel> = null;
ssrnetgender: null | GraphModel | Promise<GraphModel> = null;
handpose: null | GraphModel | Promise<GraphModel> = null;
handskeleton: null | GraphModel | Promise<GraphModel> = null;
handtrack: null | GraphModel | Promise<GraphModel> = null;
liveness: null | GraphModel | Promise<GraphModel> = null;
meet: null | GraphModel | Promise<GraphModel> = null;
movenet: null | GraphModel | Promise<GraphModel> = null;
nanodet: null | GraphModel | Promise<GraphModel> = null;
posenet: null | GraphModel | Promise<GraphModel> = null;
selfie: null | GraphModel | Promise<GraphModel> = null;
rvm: null | GraphModel | Promise<GraphModel> = null;
antispoof: null | GraphModel | Promise<GraphModel> = null;
}
/** structure that holds global stats for currently loaded models */
export interface ModelStats {
numLoadedModels: number,
numDefinedModels: number,
percentageLoaded: number,
totalSizeFromManifest: number,
totalSizeWeights: number,
totalSizeLoading: number,
totalSizeEnabled: undefined,
modelStats: ModelInfo[],
}
let instance: Human;
export const getModelStats = (currentInstance: Human): ModelStats => {
if (currentInstance) instance = currentInstance;
if (!instance) log('instance not registered');
let totalSizeFromManifest = 0;
let totalSizeWeights = 0;
let totalSizeLoading = 0;
for (const m of Object.values(modelStats)) {
totalSizeFromManifest += m.sizeFromManifest;
totalSizeWeights += m.sizeLoadedWeights;
totalSizeLoading += m.sizeDesired;
}
const percentageLoaded = totalSizeLoading > 0 ? totalSizeWeights / totalSizeLoading : 0;
return {
numLoadedModels: Object.values(modelStats).length,
numDefinedModels: Object.keys(instance.models).length,
percentageLoaded,
totalSizeFromManifest,
totalSizeWeights,
totalSizeLoading,
totalSizeEnabled: undefined,
modelStats: Object.values(modelStats),
};
};
export function reset(currentInstance: Human): void {
if (currentInstance) instance = currentInstance;
// if (instance.config.debug) log('resetting loaded models');
for (const model of Object.keys(instance.models)) instance.models[model as keyof Models] = null;
}
/** Load method preloads all instance.configured models on-demand */
export async function load(currentInstance: Human): Promise<void> {
if (currentInstance) instance = currentInstance;
if (!instance) log('instance not registered');
if (env.initial) reset(instance);
if (instance.config.hand.enabled) { // handpose model is a combo that must be loaded as a whole
if (!instance.models.handpose && instance.config.hand.detector?.modelPath?.includes('handdetect')) {
[instance.models.handpose, instance.models.handskeleton] = await handpose.load(instance.config);
}
if (!instance.models.handskeleton && instance.config.hand.landmarks && instance.config.hand.detector?.modelPath?.includes('handdetect')) {
[instance.models.handpose, instance.models.handskeleton] = await handpose.load(instance.config);
}
}
if (instance.config.body.enabled && !instance.models.blazepose && instance.config.body.modelPath?.includes('blazepose')) instance.models.blazepose = blazepose.loadPose(instance.config);
if (instance.config.body.enabled && !instance.models.blazeposedetect && instance.config.body['detector'] && instance.config.body['detector'].modelPath) instance.models.blazeposedetect = blazepose.loadDetect(instance.config);
if (instance.config.body.enabled && !instance.models.efficientpose && instance.config.body.modelPath?.includes('efficientpose')) instance.models.efficientpose = efficientpose.load(instance.config);
if (instance.config.body.enabled && !instance.models.movenet && instance.config.body.modelPath?.includes('movenet')) instance.models.movenet = movenet.load(instance.config);
if (instance.config.body.enabled && !instance.models.posenet && instance.config.body.modelPath?.includes('posenet')) instance.models.posenet = posenet.load(instance.config);
if (instance.config.face.enabled && !instance.models.facedetect) instance.models.facedetect = blazeface.load(instance.config);
if (instance.config.face.enabled && instance.config.face.antispoof?.enabled && !instance.models.antispoof) instance.models.antispoof = antispoof.load(instance.config);
if (instance.config.face.enabled && instance.config.face.liveness?.enabled && !instance.models.liveness) instance.models.liveness = liveness.load(instance.config);
if (instance.config.face.enabled && instance.config.face.description?.enabled && !instance.models.faceres) instance.models.faceres = faceres.load(instance.config);
if (instance.config.face.enabled && instance.config.face.emotion?.enabled && !instance.models.emotion) instance.models.emotion = emotion.load(instance.config);
if (instance.config.face.enabled && instance.config.face.iris?.enabled && !instance.config.face.attention?.enabled && !instance.models.faceiris) instance.models.faceiris = iris.load(instance.config);
if (instance.config.face.enabled && instance.config.face.mesh?.enabled && (!instance.models.facemesh)) instance.models.facemesh = facemesh.load(instance.config);
if (instance.config.face.enabled && instance.config.face['gear']?.enabled && !instance.models.gear) instance.models.gear = gear.load(instance.config);
if (instance.config.face.enabled && instance.config.face['ssrnet']?.enabled && !instance.models.ssrnetage) instance.models.ssrnetage = ssrnetAge.load(instance.config);
if (instance.config.face.enabled && instance.config.face['ssrnet']?.enabled && !instance.models.ssrnetgender) instance.models.ssrnetgender = ssrnetGender.load(instance.config);
if (instance.config.face.enabled && instance.config.face['mobilefacenet']?.enabled && !instance.models.mobilefacenet) instance.models.mobilefacenet = mobilefacenet.load(instance.config);
if (instance.config.face.enabled && instance.config.face['insightface']?.enabled && !instance.models.insightface) instance.models.insightface = insightface.load(instance.config);
if (instance.config.hand.enabled && !instance.models.handtrack && instance.config.hand.detector?.modelPath?.includes('handtrack')) instance.models.handtrack = handtrack.loadDetect(instance.config);
if (instance.config.hand.enabled && instance.config.hand.landmarks && !instance.models.handskeleton && instance.config.hand.detector?.modelPath?.includes('handtrack')) instance.models.handskeleton = handtrack.loadSkeleton(instance.config);
if (instance.config.object.enabled && !instance.models.centernet && instance.config.object.modelPath?.includes('centernet')) instance.models.centernet = centernet.load(instance.config);
if (instance.config.object.enabled && !instance.models.nanodet && instance.config.object.modelPath?.includes('nanodet')) instance.models.nanodet = nanodet.load(instance.config);
if (instance.config.segmentation.enabled && !instance.models.selfie && instance.config.segmentation.modelPath?.includes('selfie')) instance.models.selfie = selfie.load(instance.config);
if (instance.config.segmentation.enabled && !instance.models.meet && instance.config.segmentation.modelPath?.includes('meet')) instance.models.meet = meet.load(instance.config);
if (instance.config.segmentation.enabled && !instance.models.rvm && instance.config.segmentation.modelPath?.includes('rvm')) instance.models.rvm = rvm.load(instance.config);
// models are loaded in parallel asynchronously so let's wait until they are actually loaded
for await (const model of Object.keys(instance.models)) {
if (instance.models[model as keyof Models] && typeof instance.models[model as keyof Models] !== 'undefined') {
instance.models[model as keyof Models] = await instance.models[model as keyof Models];
}
}
}
export interface KernelOps { name: string, url: string, missing: string[], ops: string[] }
export function validateModel(currentInstance: Human | null, model: GraphModel | null, name: string): KernelOps | null {
export function validateModel(instance: Human | null, model: GraphModel | null, name: string): KernelOps | null {
if (!model) return null;
if (currentInstance) instance = currentInstance;
if (!instance) log('instance not registered');
if (!instance?.config?.validateModels) return null;
const simpleOps = ['const', 'placeholder', 'noop', 'pad', 'squeeze', 'add', 'sub', 'mul', 'div'];
const ignoreOps = ['biasadd', 'fusedbatchnormv3', 'matmul', 'switch', 'shape', 'merge', 'split', 'broadcastto'];
@@ -193,15 +67,124 @@ export function validateModel(currentInstance: Human | null, model: GraphModel |
return missing.length > 0 ? { name, missing, ops, url } : null;
}
export function validate(currentInstance: Human): { name: string, missing: string[] }[] {
if (currentInstance) instance = currentInstance;
if (!instance) log('instance not registered');
const missing: KernelOps[] = [];
for (const defined of Object.keys(currentInstance.models)) {
const model: GraphModel | null = currentInstance.models[defined as keyof Models] as GraphModel | null;
if (!model) continue;
const res = validateModel(currentInstance, model, defined);
if (res) missing.push(res);
}
return missing;
}
/** structure that holds global stats for currently loaded models */
export interface ModelStats {
numLoadedModels: number,
numDefinedModels: number,
percentageLoaded: number,
totalSizeFromManifest: number,
totalSizeWeights: number,
totalSizeLoading: number,
modelStats: ModelInfo[],
}
/** Models class used by Human
* - models: record of all GraphModels
* - list: returns list of configured models with their stats
* - loaded: returns array of loaded models
* - reset: unloads all models
* - validate: checks loaded models for valid kernel ops vs current backend
* - stats: live detailed model stats that can be checked during model load phase
*/
export class Models {
instance: Human;
models: Record<string, null | GraphModel>;
constructor(currentInstance: Human) {
this.models = {};
this.instance = currentInstance;
}
stats(): ModelStats {
let totalSizeFromManifest = 0;
let totalSizeWeights = 0;
let totalSizeLoading = 0;
for (const m of Object.values(modelStats)) {
totalSizeFromManifest += m.sizeFromManifest;
totalSizeWeights += m.sizeLoadedWeights;
totalSizeLoading += m.sizeDesired;
}
const percentageLoaded = totalSizeLoading > 0 ? totalSizeWeights / totalSizeLoading : 0;
return {
numLoadedModels: Object.values(modelStats).length,
numDefinedModels: Object.keys(this.models).length,
percentageLoaded,
totalSizeFromManifest,
totalSizeWeights,
totalSizeLoading,
modelStats: Object.values(modelStats),
};
}
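The new `stats()` method above is essentially a fold over the per-model entries in `modelStats`. A minimal standalone sketch of the same aggregation, with a reduced `ModelInfo`-like shape and made-up numbers (not values from the real library):

```typescript
// minimal stand-in for the per-model entries tracked by the loader
interface ModelInfoLite { sizeFromManifest: number; sizeLoadedWeights: number; sizeDesired: number }

function aggregate(stats: Record<string, ModelInfoLite>) {
  let totalSizeFromManifest = 0;
  let totalSizeWeights = 0;
  let totalSizeLoading = 0;
  for (const m of Object.values(stats)) {
    totalSizeFromManifest += m.sizeFromManifest;
    totalSizeWeights += m.sizeLoadedWeights;
    totalSizeLoading += m.sizeDesired;
  }
  // guard against divide-by-zero before any model load has been requested
  const percentageLoaded = totalSizeLoading > 0 ? totalSizeWeights / totalSizeLoading : 0;
  return { totalSizeFromManifest, totalSizeWeights, totalSizeLoading, percentageLoaded };
}

// hypothetical numbers: one fully loaded model, one still downloading
const demo = aggregate({
  faceres: { sizeFromManifest: 100, sizeLoadedWeights: 100, sizeDesired: 100 },
  emotion: { sizeFromManifest: 100, sizeLoadedWeights: 50, sizeDesired: 100 },
});
```

Reporting a byte-weighted percentage rather than a model count means one large half-downloaded model is reflected proportionally in `percentageLoaded`.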
reset(): void {
for (const model of Object.keys(this.models)) this.models[model] = null;
}
async load(): Promise<void> {
if (env.initial) this.reset();
const m: Record<string, null | GraphModel | Promise<GraphModel>> = {};
// face main models
m.blazeface = (this.instance.config.face.enabled && !this.models.blazeface) ? blazeface.load(this.instance.config) : null;
m.antispoof = (this.instance.config.face.enabled && this.instance.config.face.antispoof?.enabled && !this.models.antispoof) ? antispoof.load(this.instance.config) : null;
m.liveness = (this.instance.config.face.enabled && this.instance.config.face.liveness?.enabled && !this.models.liveness) ? liveness.load(this.instance.config) : null;
m.faceres = (this.instance.config.face.enabled && this.instance.config.face.description?.enabled && !this.models.faceres) ? faceres.load(this.instance.config) : null;
m.emotion = (this.instance.config.face.enabled && this.instance.config.face.emotion?.enabled && !this.models.emotion) ? emotion.load(this.instance.config) : null;
m.iris = (this.instance.config.face.enabled && this.instance.config.face.iris?.enabled && !this.instance.config.face.attention?.enabled && !this.models.iris) ? iris.load(this.instance.config) : null;
m.facemesh = (this.instance.config.face.enabled && this.instance.config.face.mesh?.enabled && (!this.models.facemesh)) ? facemesh.load(this.instance.config) : null;
// face alternatives
m.gear = (this.instance.config.face.enabled && this.instance.config.face['gear']?.enabled && !this.models.gear) ? gear.load(this.instance.config) : null;
m.ssrnetage = (this.instance.config.face.enabled && this.instance.config.face['ssrnet']?.enabled && !this.models.ssrnetage) ? ssrnetAge.load(this.instance.config) : null;
m.ssrnetgender = (this.instance.config.face.enabled && this.instance.config.face['ssrnet']?.enabled && !this.models.ssrnetgender) ? ssrnetGender.load(this.instance.config) : null;
m.mobilefacenet = (this.instance.config.face.enabled && this.instance.config.face['mobilefacenet']?.enabled && !this.models.mobilefacenet) ? mobilefacenet.load(this.instance.config) : null;
m.insightface = (this.instance.config.face.enabled && this.instance.config.face['insightface']?.enabled && !this.models.insightface) ? insightface.load(this.instance.config) : null;
// body alternatives
m.blazepose = (this.instance.config.body.enabled && !this.models.blazepose && this.instance.config.body.modelPath?.includes('blazepose')) ? blazepose.loadPose(this.instance.config) : null;
m.blazeposedetect = (this.instance.config.body.enabled && !this.models.blazeposedetect && this.instance.config.body['detector'] && this.instance.config.body['detector'].modelPath) ? blazepose.loadDetect(this.instance.config) : null;
m.efficientpose = (this.instance.config.body.enabled && !this.models.efficientpose && this.instance.config.body.modelPath?.includes('efficientpose')) ? efficientpose.load(this.instance.config) : null;
m.movenet = (this.instance.config.body.enabled && !this.models.movenet && this.instance.config.body.modelPath?.includes('movenet')) ? movenet.load(this.instance.config) : null;
m.posenet = (this.instance.config.body.enabled && !this.models.posenet && this.instance.config.body.modelPath?.includes('posenet')) ? posenet.load(this.instance.config) : null;
// hand alternatives
m.handtrack = (this.instance.config.hand.enabled && !this.models.handtrack && this.instance.config.hand.detector?.modelPath?.includes('handtrack')) ? handtrack.loadDetect(this.instance.config) : null;
m.handskeleton = (this.instance.config.hand.enabled && this.instance.config.hand.landmarks && !this.models.handskeleton && this.instance.config.hand.detector?.modelPath?.includes('handtrack')) ? handtrack.loadSkeleton(this.instance.config) : null;
if (this.instance.config.hand.detector?.modelPath?.includes('handdetect')) [m.handpose, m.handskeleton] = (!this.models.handpose) ? await handpose.load(this.instance.config) : [null, null];
// object detection alternatives
m.centernet = (this.instance.config.object.enabled && !this.models.centernet && this.instance.config.object.modelPath?.includes('centernet')) ? centernet.load(this.instance.config) : null;
m.nanodet = (this.instance.config.object.enabled && !this.models.nanodet && this.instance.config.object.modelPath?.includes('nanodet')) ? nanodet.load(this.instance.config) : null;
// segmentation alternatives
m.selfie = (this.instance.config.segmentation.enabled && !this.models.selfie && this.instance.config.segmentation.modelPath?.includes('selfie')) ? selfie.load(this.instance.config) : null;
m.meet = (this.instance.config.segmentation.enabled && !this.models.meet && this.instance.config.segmentation.modelPath?.includes('meet')) ? meet.load(this.instance.config) : null;
m.rvm = (this.instance.config.segmentation.enabled && !this.models.rvm && this.instance.config.segmentation.modelPath?.includes('rvm')) ? rvm.load(this.instance.config) : null;
// models are loaded in parallel asynchronously so let's wait until they are actually loaded
await Promise.all([...Object.values(m)]);
for (const model of Object.keys(m)) this.models[model] = m[model] as GraphModel || this.models[model] || null; // only update actually loaded models
}
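The rewritten `load()` above kicks off every enabled load in parallel, awaits them all, then only overwrites entries that actually resolved, so an already-loaded model is never clobbered by `null`. A self-contained sketch of that merge pattern, with stub types and loaders standing in for the real `GraphModel` and `blazeface.load()`-style functions:

```typescript
type Model = { name: string }; // stand-in for a loaded GraphModel

async function loadAll(
  current: Record<string, Model | null>,
  loaders: Record<string, (() => Promise<Model>) | null>, // null = disabled or already loaded
): Promise<Record<string, Model | null>> {
  const pending: Record<string, Promise<Model> | null> = {};
  for (const [name, loader] of Object.entries(loaders)) {
    pending[name] = loader ? loader() : null; // start all enabled loads in parallel
  }
  await Promise.all(Object.values(pending)); // wait until every pending load settles
  const merged: Record<string, Model | null> = { ...current };
  for (const name of Object.keys(pending)) {
    merged[name] = (await pending[name]) || current[name] || null; // only update actually loaded models
  }
  return merged;
}
```

The `|| current[name]` fallback is the key detail: a `null` slot in the pending map means "nothing new to load", not "unload this model".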
list() {
const models = Object.keys(this.models).map((model) => ({ name: model, loaded: (this.models[model] !== null), size: 0, url: this.models[model] ? this.models[model]?.['modelUrl'] : null }));
for (const m of models) {
const stats = Object.keys(modelStats).find((s) => s.startsWith(m.name));
if (!stats) continue;
m.size = modelStats[stats].sizeLoadedWeights;
m.url = modelStats[stats].url;
}
return models;
}
loaded() {
const list = this.list();
const loaded = list.filter((model) => model.loaded).map((model) => model.name);
return loaded;
}
validate(): { name: string, missing: string[] }[] {
const missing: KernelOps[] = [];
for (const defined of Object.keys(this.models)) {
const model: GraphModel | null = this.models[defined as keyof Models];
if (!model) continue;
const res = validateModel(this.instance, model, defined);
if (res) missing.push(res);
}
return missing;
}
}
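The `validate()` method boils down to a set difference between the ops a model graph references and the kernels the active backend registers, with some ops ignored because the backend fuses or rewrites them. A hypothetical standalone sketch of that check (the real code reads op names from the `GraphModel` artifacts and the kernel list from the TFJS registry):

```typescript
// compare a model's op list against the backend's registered kernels
function missingOps(modelOps: string[], registeredKernels: Set<string>, ignore: string[] = []): string[] {
  const missing: string[] = [];
  for (const op of modelOps.map((o) => o.toLowerCase())) {
    if (ignore.includes(op)) continue; // ops fused or rewritten by the backend
    if (!registeredKernels.has(op)) missing.push(op); // no kernel for this op on the current backend
  }
  return missing;
}
```

Running this at load time surfaces an unsupported-op failure immediately instead of deep inside the first inference call.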

View File

@ -4,7 +4,6 @@ import * as tf from 'dist/tfjs.esm.js';
import type { Human } from '../human';
import { log } from '../util/util';
import * as image from '../image/image';
import * as models from '../models';
import type { AnyCanvas } from '../exports';
export const config = {
@@ -46,7 +45,7 @@ export function register(instance: Human): void {
if (instance.config.backend !== 'humangl') return;
if ((config.name in tf.engine().registry) && !config?.gl?.getParameter(config.gl.VERSION)) {
log('humangl error: backend invalid context');
models.reset(instance);
instance.models.reset();
/*
log('resetting humangl backend');
await tf.removeBackend(config.name);

View File

@@ -18,6 +18,7 @@ export interface ModelInfo {
sizeDesired: number,
sizeFromManifest: number,
sizeLoadedWeights: number,
url: string,
}
export const modelStats: Record<string, ModelInfo> = {};
@@ -45,6 +46,7 @@ export async function loadModel(modelPath: string | undefined): Promise<GraphMod
sizeLoadedWeights: 0,
sizeDesired: modelsDefs[shortModelName],
inCache: false,
url: '',
};
options.cacheSupported = (typeof indexedDB !== 'undefined'); // check if localStorage and indexedb are available
let cachedModels = {};
@@ -54,8 +56,9 @@ export async function loadModel(modelPath: string | undefined): Promise<GraphMod
options.cacheSupported = false;
}
modelStats[shortModelName].inCache = (options.cacheSupported && options.cacheModels) && Object.keys(cachedModels).includes(cachedModelName); // is model found in cache
modelStats[shortModelName].url = modelStats[shortModelName].inCache ? cachedModelName : modelUrl;
const tfLoadOptions = typeof fetch === 'undefined' ? {} : { fetchFunc: (url: string, init?: RequestInit) => httpHandler(url, init) };
let model: GraphModel = new tf.GraphModel(modelStats[shortModelName].inCache ? cachedModelName : modelUrl, tfLoadOptions) as unknown as GraphModel; // create model prototype and decide if load from cache or from original modelurl
let model: GraphModel = new tf.GraphModel(modelStats[shortModelName].url, tfLoadOptions) as unknown as GraphModel; // create model prototype and decide if load from cache or from original modelurl
let loaded = false;
try {
// @ts-ignore private function
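The `loadModel` change in this hunk hoists the cache-or-network decision into the stats record so the chosen URL is computed once and reused. A reduced sketch of that decision (function and parameter names are illustrative, not the library's API):

```typescript
interface LoadInfo { inCache: boolean; url: string }

// decide once whether to load from the browser cache entry or the original model URL
function resolveModelUrl(cacheSupported: boolean, cacheModels: boolean, cachedKeys: string[], cachedName: string, modelUrl: string): LoadInfo {
  const inCache = cacheSupported && cacheModels && cachedKeys.includes(cachedName);
  return { inCache, url: inCache ? cachedName : modelUrl };
}
```

Recording `url` alongside `inCache` also lets `list()` report exactly where each model was loaded from.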

View File

@@ -8,9 +8,9 @@ import * as sample from './sample';
import * as image from './image/image';
import * as backend from './tfjs/backend';
import { env } from './util/env';
import type { Config } from './config';
import { emptyResult, Result } from './result';
import { Human, models } from './human';
import type { Config } from './config';
import type { Human } from './human';
import type { Tensor, DataType } from './tfjs/types';
async function warmupBitmap(instance: Human): Promise<Result | undefined> {
@@ -161,7 +161,7 @@ export async function warmup(instance: Human, userConfig?: Partial<Config>): Pro
return emptyResult();
}
return new Promise(async (resolve) => {
await models.load(instance);
await instance.models.load();
await runCompile(instance);
const res = await runInference(instance);
const t1 = now();

View File

@@ -1,105 +1,56 @@
2022-11-17 14:37:08 DATA:  Build {"name":"@vladmandic/human","version":"3.0.0"}
2022-11-17 14:37:08 INFO:  Application: {"name":"@vladmandic/human","version":"3.0.0"}
2022-11-17 14:37:08 INFO:  Environment: {"profile":"production","config":".build.json","package":"package.json","tsconfig":true,"eslintrc":true,"git":true}
2022-11-17 14:37:08 INFO:  Toolchain: {"build":"0.7.14","esbuild":"0.15.14","typescript":"4.9.3","typedoc":"0.23.21","eslint":"8.27.0"}
2022-11-17 14:37:08 INFO:  Build: {"profile":"production","steps":["clean","compile","typings","typedoc","lint","changelog"]}
2022-11-17 14:37:08 STATE: Clean: {"locations":["dist/*","types/*","typedoc/*"]}
2022-11-17 14:37:08 STATE: Compile: {"name":"tfjs/browser/version","format":"esm","platform":"browser","input":"tfjs/tf-version.ts","output":"dist/tfjs.version.js","files":1,"inputBytes":1289,"outputBytes":361}
2022-11-17 14:37:08 STATE: Compile: {"name":"tfjs/nodejs/cpu","format":"cjs","platform":"node","input":"tfjs/tf-node.ts","output":"dist/tfjs.esm.js","files":2,"inputBytes":569,"outputBytes":924}
2022-11-17 14:37:08 STATE: Compile: {"name":"human/nodejs/cpu","format":"cjs","platform":"node","input":"src/human.ts","output":"dist/human.node.js","files":80,"inputBytes":670179,"outputBytes":317460}
2022-11-17 14:37:08 STATE: Compile: {"name":"tfjs/nodejs/gpu","format":"cjs","platform":"node","input":"tfjs/tf-node-gpu.ts","output":"dist/tfjs.esm.js","files":2,"inputBytes":577,"outputBytes":928}
2022-11-17 14:37:08 STATE: Compile: {"name":"human/nodejs/gpu","format":"cjs","platform":"node","input":"src/human.ts","output":"dist/human.node-gpu.js","files":80,"inputBytes":670183,"outputBytes":317464}
2022-11-17 14:37:08 STATE: Compile: {"name":"tfjs/nodejs/wasm","format":"cjs","platform":"node","input":"tfjs/tf-node-wasm.ts","output":"dist/tfjs.esm.js","files":2,"inputBytes":665,"outputBytes":1876}
2022-11-17 14:37:08 STATE: Compile: {"name":"human/nodejs/wasm","format":"cjs","platform":"node","input":"src/human.ts","output":"dist/human.node-wasm.js","files":80,"inputBytes":671131,"outputBytes":317575}
2022-11-17 14:37:08 STATE: Compile: {"name":"tfjs/browser/esm/nobundle","format":"esm","platform":"browser","input":"tfjs/tf-browser.ts","output":"dist/tfjs.esm.js","files":2,"inputBytes":1375,"outputBytes":670}
2022-11-17 14:37:08 STATE: Compile: {"name":"human/browser/esm/nobundle","format":"esm","platform":"browser","input":"src/human.ts","output":"dist/human.esm-nobundle.js","files":80,"inputBytes":669925,"outputBytes":316039}
2022-11-17 14:37:08 STATE: Compile: {"name":"tfjs/browser/esm/bundle","format":"esm","platform":"browser","input":"tfjs/tf-browser.ts","output":"dist/tfjs.esm.js","files":10,"inputBytes":1375,"outputBytes":1144900}
2022-11-17 14:37:08 STATE: Compile: {"name":"human/browser/iife/bundle","format":"iife","platform":"browser","input":"src/human.ts","output":"dist/human.js","files":80,"inputBytes":1814155,"outputBytes":1457353}
2022-11-17 14:37:09 STATE: Compile: {"name":"human/browser/esm/bundle","format":"esm","platform":"browser","input":"src/human.ts","output":"dist/human.esm.js","files":80,"inputBytes":1814155,"outputBytes":1914737}
2022-11-17 14:37:12 STATE: Typings: {"input":"src/human.ts","output":"types/lib","files":15}
2022-11-17 14:37:14 STATE: TypeDoc: {"input":"src/human.ts","output":"typedoc","objects":77,"generated":true}
2022-11-17 14:37:14 STATE: Compile: {"name":"demo/typescript","format":"esm","platform":"browser","input":"demo/typescript/index.ts","output":"demo/typescript/index.js","files":1,"inputBytes":6135,"outputBytes":2913}
2022-11-17 14:37:14 STATE: Compile: {"name":"demo/faceid","format":"esm","platform":"browser","input":"demo/faceid/index.ts","output":"demo/faceid/index.js","files":2,"inputBytes":17572,"outputBytes":9456}
2022-11-17 14:37:22 STATE: Lint: {"locations":["*.json","src/**/*.ts","test/**/*.js","demo/**/*.js"],"files":114,"errors":0,"warnings":1}
2022-11-17 14:37:22 WARN: 
/home/vlado/dev/human/src/human.ts
  42:17  warning  'DrawOptions' is defined but never used  @typescript-eslint/no-unused-vars
✖ 1 problem (0 errors, 1 warning)
2022-11-17 14:37:22 STATE: ChangeLog: {"repository":"https://github.com/vladmandic/human","branch":"main","output":"CHANGELOG.md"}
2022-11-17 14:37:22 STATE: Copy: {"input":"node_modules/@vladmandic/tfjs/types/tfjs-core.d.ts","output":"types/tfjs-core.d.ts"}
2022-11-17 14:37:22 INFO:  Done...
2022-11-17 10:11:38 STATE: Filter: {"input":"types/human.d.ts"} 2022-11-17 14:37:22 STATE: Copy: {"input":"node_modules/@vladmandic/tfjs/types/tfjs.d.ts","output":"types/tfjs.esm.d.ts"}
2022-11-17 10:11:38 STATE: Write: {"output":"dist/human.esm-nobundle.d.ts"} 2022-11-17 14:37:22 STATE: Copy: {"input":"src/types/tsconfig.json","output":"types/tsconfig.json"}
2022-11-17 10:11:38 STATE: Write: {"output":"dist/human.esm.d.ts"} 2022-11-17 14:37:22 STATE: Copy: {"input":"src/types/eslint.json","output":"types/.eslintrc.json"}
2022-11-17 10:11:38 STATE: Write: {"output":"dist/human.d.ts"} 2022-11-17 14:37:22 STATE: Copy: {"input":"src/types/tfjs.esm.d.ts","output":"dist/tfjs.esm.d.ts"}
2022-11-17 10:11:38 STATE: Write: {"output":"dist/human.node-gpu.d.ts"} 2022-11-17 14:37:22 STATE: Filter: {"input":"types/tfjs-core.d.ts"}
2022-11-17 10:11:38 STATE: Write: {"output":"dist/human.node.d.ts"} 2022-11-17 14:37:23 STATE: API-Extractor: {"succeeeded":true,"errors":0,"warnings":195}
2022-11-17 10:11:38 STATE: Write: {"output":"dist/human.node-wasm.d.ts"} 2022-11-17 14:37:23 STATE: Filter: {"input":"types/human.d.ts"}
2022-11-17 10:11:38 INFO:  Analyze models: {"folders":8,"result":"models/models.json"} 2022-11-17 14:37:23 STATE: Write: {"output":"dist/human.esm-nobundle.d.ts"}
2022-11-17 10:11:38 STATE: Models {"folder":"./models","models":12} 2022-11-17 14:37:23 STATE: Write: {"output":"dist/human.esm.d.ts"}
2022-11-17 10:11:38 STATE: Models {"folder":"../human-models/models","models":43} 2022-11-17 14:37:23 STATE: Write: {"output":"dist/human.d.ts"}
2022-11-17 10:11:38 STATE: Models {"folder":"../blazepose/model/","models":4} 2022-11-17 14:37:23 STATE: Write: {"output":"dist/human.node-gpu.d.ts"}
2022-11-17 10:11:38 STATE: Models {"folder":"../anti-spoofing/model","models":1} 2022-11-17 14:37:23 STATE: Write: {"output":"dist/human.node.d.ts"}
2022-11-17 10:11:38 STATE: Models {"folder":"../efficientpose/models","models":3} 2022-11-17 14:37:23 STATE: Write: {"output":"dist/human.node-wasm.d.ts"}
2022-11-17 10:11:38 STATE: Models {"folder":"../insightface/models","models":5} 2022-11-17 14:37:23 INFO:  Analyze models: {"folders":8,"result":"models/models.json"}
2022-11-17 10:11:38 STATE: Models {"folder":"../movenet/models","models":3} 2022-11-17 14:37:23 STATE: Models {"folder":"./models","models":12}
2022-11-17 10:11:38 STATE: Models {"folder":"../nanodet/models","models":4} 2022-11-17 14:37:23 STATE: Models {"folder":"../human-models/models","models":43}
2022-11-17 10:11:39 STATE: Models: {"count":58,"totalSize":386543911} 2022-11-17 14:37:23 STATE: Models {"folder":"../blazepose/model/","models":4}
2022-11-17 10:11:39 INFO:  Human Build complete... {"logFile":"test/build.log"} 2022-11-17 14:37:23 STATE: Models {"folder":"../anti-spoofing/model","models":1}
2022-11-17 10:16:08 INFO:  @vladmandic/human version 3.0.0 2022-11-17 14:37:23 STATE: Models {"folder":"../efficientpose/models","models":3}
2022-11-17 10:16:08 INFO:  User: vlado Platform: linux Arch: x64 Node: v19.1.0 2022-11-17 14:37:23 STATE: Models {"folder":"../insightface/models","models":5}
2022-11-17 10:16:08 INFO:  Application: {"name":"@vladmandic/human","version":"3.0.0"} 2022-11-17 14:37:23 STATE: Models {"folder":"../movenet/models","models":3}
2022-11-17 10:16:08 INFO:  Environment: {"profile":"development","config":".build.json","package":"package.json","tsconfig":true,"eslintrc":true,"git":true} 2022-11-17 14:37:23 STATE: Models {"folder":"../nanodet/models","models":4}
2022-11-17 10:16:08 INFO:  Toolchain: {"build":"0.7.14","esbuild":"0.15.14","typescript":"4.9.3","typedoc":"0.23.21","eslint":"8.27.0"} 2022-11-17 14:37:24 STATE: Models: {"count":58,"totalSize":386543911}
2022-11-17 10:16:08 INFO:  Build: {"profile":"development","steps":["serve","watch","compile"]} 2022-11-17 14:37:24 INFO:  Human Build complete... {"logFile":"test/build.log"}
2022-11-17 10:16:08 STATE: WebServer: {"ssl":false,"port":8000,"root":"."}
2022-11-17 10:16:08 STATE: WebServer: {"ssl":true,"port":8001,"root":".","sslKey":"node_modules/@vladmandic/build/cert/https.key","sslCrt":"node_modules/@vladmandic/build/cert/https.crt"}
2022-11-17 10:16:08 STATE: Watch: {"locations":["src/**/*","tfjs/**/*","demo/**/*.ts"]}
2022-11-17 10:16:08 STATE: Compile: {"name":"tfjs/browser/version","format":"esm","platform":"browser","input":"tfjs/tf-version.ts","output":"dist/tfjs.version.js","files":1,"inputBytes":1289,"outputBytes":1357}
2022-11-17 10:16:08 STATE: Compile: {"name":"tfjs/nodejs/cpu","format":"cjs","platform":"node","input":"tfjs/tf-node.ts","output":"dist/tfjs.esm.js","files":2,"inputBytes":1565,"outputBytes":1786}
2022-11-17 10:16:08 STATE: Compile: {"name":"human/nodejs/cpu","format":"cjs","platform":"node","input":"src/human.ts","output":"dist/human.node.js","files":80,"inputBytes":674445,"outputBytes":507569}
2022-11-17 10:16:08 STATE: Compile: {"name":"tfjs/nodejs/gpu","format":"cjs","platform":"node","input":"tfjs/tf-node-gpu.ts","output":"dist/tfjs.esm.js","files":2,"inputBytes":1573,"outputBytes":1810}
2022-11-17 10:16:08 STATE: Compile: {"name":"human/nodejs/gpu","format":"cjs","platform":"node","input":"src/human.ts","output":"dist/human.node-gpu.js","files":80,"inputBytes":674469,"outputBytes":507589}
2022-11-17 10:16:08 STATE: Compile: {"name":"tfjs/nodejs/wasm","format":"cjs","platform":"node","input":"tfjs/tf-node-wasm.ts","output":"dist/tfjs.esm.js","files":2,"inputBytes":1661,"outputBytes":1992}
2022-11-17 10:16:08 STATE: Compile: {"name":"human/nodejs/wasm","format":"cjs","platform":"node","input":"src/human.ts","output":"dist/human.node-wasm.js","files":80,"inputBytes":674651,"outputBytes":507780}
2022-11-17 10:16:08 STATE: Compile: {"name":"tfjs/browser/esm/nobundle","format":"esm","platform":"browser","input":"tfjs/tf-browser.ts","output":"dist/tfjs.esm.js","files":2,"inputBytes":2371,"outputBytes":923}
2022-11-17 10:16:08 STATE: Compile: {"name":"human/browser/esm/nobundle","format":"esm","platform":"browser","input":"src/human.ts","output":"dist/human.esm-nobundle.js","files":80,"inputBytes":673582,"outputBytes":510177}
2022-11-17 10:16:09 STATE: Compile: {"name":"tfjs/browser/esm/bundle","format":"esm","platform":"browser","input":"tfjs/tf-browser.ts","output":"dist/tfjs.esm.js","files":10,"inputBytes":2371,"outputBytes":1144900}
2022-11-17 10:16:09 STATE: Compile: {"name":"human/browser/iife/bundle","format":"iife","platform":"browser","input":"src/human.ts","output":"dist/human.js","files":80,"inputBytes":1817559,"outputBytes":1457643}
2022-11-17 10:16:09 STATE: Compile: {"name":"human/browser/esm/bundle","format":"esm","platform":"browser","input":"src/human.ts","output":"dist/human.esm.js","files":80,"inputBytes":1817559,"outputBytes":1917209}
2022-11-17 10:16:09 STATE: Compile: {"name":"demo/typescript","format":"esm","platform":"browser","input":"demo/typescript/index.ts","output":"demo/typescript/index.js","files":1,"inputBytes":6136,"outputBytes":4208}
2022-11-17 10:16:09 STATE: Compile: {"name":"demo/faceid","format":"esm","platform":"browser","input":"demo/faceid/index.ts","output":"demo/faceid/index.js","files":2,"inputBytes":17567,"outputBytes":13914}
2022-11-17 10:16:09 INFO:  Listening...
2022-11-17 10:16:20 DATA:  HTTPS: {"method":"GET","ver":"2.0","status":301,"url":"/demo/typescript","redirect":"/demo/typescript/index.html","remote":"::1"}
2022-11-17 10:16:20 DATA:  HTTPS: {"method":"GET","ver":"2.0","status":200,"mime":"text/html","size":1953,"url":"/demo/typescript/index.html","remote":"::1"}
2022-11-17 10:16:20 DATA:  HTTPS: {"method":"GET","ver":"2.0","status":200,"mime":"text/javascript","size":4208,"url":"/demo/typescript/index.js","remote":"::1"}
2022-11-17 10:16:22 DATA:  HTTPS: {"method":"GET","ver":"2.0","status":301,"url":"/demo/faceid","redirect":"/demo/faceid/index.html","remote":"::1"}
2022-11-17 10:16:22 DATA:  HTTPS: {"method":"GET","ver":"2.0","status":200,"mime":"text/html","size":3415,"url":"/demo/faceid/index.html","remote":"::1"}
2022-11-17 10:16:22 DATA:  HTTPS: {"method":"GET","ver":"2.0","status":200,"mime":"text/javascript","size":13914,"url":"/demo/faceid/index.js","remote":"::1"}
2022-11-17 10:16:22 DATA:  HTTPS: {"method":"GET","ver":"2.0","status":200,"mime":"text/javascript","size":1917209,"url":"/dist/human.esm.js","remote":"::1"}
2022-11-17 10:16:22 DATA:  HTTPS: {"method":"GET","ver":"2.0","status":200,"mime":"font/woff2","size":181500,"url":"/assets/lato-light.woff2","remote":"::1"}
2022-11-17 10:16:22 DATA:  HTTPS: {"method":"GET","ver":"2.0","status":200,"mime":"application/manifest+json","size":304,"url":"/demo/manifest.webmanifest","remote":"::1"}
2022-11-17 10:16:22 DATA:  HTTPS: {"method":"GET","ver":"2.0","status":200,"mime":"image/png","size":142790,"url":"/assets/icon.png","remote":"::1"}
2022-11-17 10:16:23 DATA:  HTTPS: {"method":"GET","ver":"2.0","status":200,"mime":"application/manifest+json","size":304,"url":"/demo/manifest.webmanifest","remote":"::1"}
2022-11-17 10:16:25 DATA:  HTTPS: {"method":"GET","ver":"2.0","status":200,"mime":"application/octet-stream","size":28470,"url":"/demo/faceid/index.js.map","remote":"::1"}
2022-11-17 10:16:25 DATA:  HTTPS: {"method":"GET","ver":"2.0","status":200,"mime":"application/octet-stream","size":3385692,"url":"/dist/human.esm.js.map","remote":"::1"}
2022-11-17 10:16:39 DATA:  HTTPS: {"method":"GET","ver":"2.0","status":200,"mime":"text/html","size":3415,"url":"/demo/faceid/index.html","remote":"::1"}
2022-11-17 10:16:39 DATA:  HTTPS: {"method":"GET","ver":"2.0","status":200,"mime":"text/javascript","size":13914,"url":"/demo/faceid/index.js","remote":"::1"}
2022-11-17 10:16:39 DATA:  HTTPS: {"method":"GET","ver":"2.0","status":200,"mime":"font/woff2","size":181500,"url":"/assets/lato-light.woff2","remote":"::1"}
2022-11-17 10:16:39 DATA:  HTTPS: {"method":"GET","ver":"2.0","status":200,"mime":"text/javascript","size":1917209,"url":"/dist/human.esm.js","remote":"::1"}
2022-11-17 10:16:39 DATA:  HTTPS: {"method":"GET","ver":"2.0","status":200,"mime":"application/octet-stream","size":28470,"url":"/demo/faceid/index.js.map","remote":"::1"}
2022-11-17 10:16:39 DATA:  HTTPS: {"method":"GET","ver":"2.0","status":200,"mime":"application/octet-stream","size":3385692,"url":"/dist/human.esm.js.map","remote":"::1"}
2022-11-17 10:16:39 DATA:  HTTPS: {"method":"GET","ver":"2.0","status":200,"mime":"image/x-icon","size":261950,"url":"/favicon.ico","remote":"::1"}
2022-11-17 10:16:39 DATA:  HTTPS: {"method":"GET","ver":"2.0","status":200,"mime":"application/manifest+json","size":304,"url":"/demo/manifest.webmanifest","remote":"::1"}
2022-11-17 10:16:39 DATA:  HTTPS: {"method":"GET","ver":"2.0","status":200,"mime":"image/png","size":142790,"url":"/assets/icon.png","remote":"::1"}
2022-11-17 10:17:25 DATA:  HTTPS: {"method":"GET","ver":"2.0","status":301,"url":"/demo/typescript","redirect":"/demo/typescript/index.html","remote":"::1"}
2022-11-17 10:17:25 DATA:  HTTPS: {"method":"GET","ver":"2.0","status":200,"mime":"text/html","size":1953,"url":"/demo/typescript/index.html","remote":"::1"}
2022-11-17 10:17:25 DATA:  HTTPS: {"method":"GET","ver":"2.0","status":200,"mime":"text/javascript","size":4208,"url":"/demo/typescript/index.js","remote":"::1"}
2022-11-17 10:17:25 DATA:  HTTPS: {"method":"GET","ver":"2.0","status":200,"mime":"font/woff2","size":181500,"url":"/assets/lato-light.woff2","remote":"::1"}
2022-11-17 10:17:25 DATA:  HTTPS: {"method":"GET","ver":"2.0","status":200,"mime":"text/javascript","size":1917209,"url":"/dist/human.esm.js","remote":"::1"}
2022-11-17 10:17:25 DATA:  HTTPS: {"method":"GET","ver":"2.0","status":200,"mime":"application/octet-stream","size":9470,"url":"/demo/typescript/index.js.map","remote":"::1"}
2022-11-17 10:17:25 DATA:  HTTPS: {"method":"GET","ver":"2.0","status":200,"mime":"application/octet-stream","size":3385692,"url":"/dist/human.esm.js.map","remote":"::1"}
2022-11-17 10:17:25 DATA:  HTTPS: {"method":"GET","ver":"2.0","status":200,"mime":"application/manifest+json","size":304,"url":"/demo/manifest.webmanifest","remote":"::1"}
2022-11-17 10:17:25 DATA:  HTTPS: {"method":"GET","ver":"2.0","status":200,"mime":"image/png","size":142790,"url":"/assets/icon.png","remote":"::1"}

View File

@@ -75,7 +75,7 @@ async function testDefault(title, testConfig = {}) {
   await human.load();
   const models = Object.keys(human.models).map((model) => ({ name: model, loaded: (human.models[model] !== null) }));
   log(' models', models);
-  const ops = await human.check();
+  const ops = await human.models.validate();
   if (ops && ops.length > 0) log(' missing ops', ops);
   const img = await image('../../samples/in/ai-body.jpg');
   const input = await human.image(img); // process image
@@ -108,7 +108,7 @@ async function testMatch() {
   const similarity = await human.similarity(desc1, desc2);
   const descArray = [];
   for (let i = 0; i < 100; i++) descArray.push(desc2);
-  const match = await human.match(desc1, descArray);
+  const match = await human.match.find(desc1, descArray);
   log(`test similarity/${human.tf.getBackend()}`, match, similarity);
 }
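The hunk above keeps building its model summary from the registry keys while swapping `human.check()` for `human.models.validate()`. As a minimal self-contained sketch of that summary step, where `registry` is a hypothetical stand-in for the real `human.models` instance (a `null` entry meaning the model is configured but not loaded):

```javascript
// `registry` is a hypothetical stand-in for the human.models instance;
// a null entry represents a model that is configured but not loaded.
const registry = { facedetect: {}, faceres: {}, emotion: null, centernet: {} };

// build the same { name, loaded } summary the test constructs
const models = Object.keys(registry).map((name) => ({ name, loaded: registry[name] !== null }));
const loaded = models.filter((m) => m.loaded);

console.log(models.length, loaded.length); // 4 declared, 3 loaded
```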

View File

@@ -9,22 +9,21 @@ const log = (status, ...data) => {
 async function main() {
   const human = new Human.Human(); // create instance of human using default configuration
   const startTime = new Date();
-  log('info', 'load start', { human: human.version, tf: tf.version_core, progress: human.getModelStats().percentageLoaded });
+  log('info', 'load start', { human: human.version, tf: tf.version_core, progress: human.models.stats().percentageLoaded });
   async function monitor() {
-    const progress = human.getModelStats().percentageLoaded;
+    const progress = human.models.stats().percentageLoaded;
     log('data', 'load interval', { elapsed: new Date() - startTime, progress });
     if (progress < 1) setTimeout(monitor, 10);
   }
   monitor();
-  // setInterval(() => log('interval', { elapsed: new Date() - startTime, progress: human.getModelStats().percentageLoaded }));
   const loadPromise = human.load();
   loadPromise
-    .then(() => log('state', 'passed', { progress: human.getModelStats().percentageLoaded }))
+    .then(() => log('state', 'passed', { progress: human.models.stats().percentageLoaded }))
     .catch(() => log('error', 'load promise'));
   await loadPromise;
-  log('info', 'load final', { progress: human.getModelStats().percentageLoaded });
+  log('info', 'load final', { progress: human.models.stats().percentageLoaded });
   await human.warmup(); // optional as model warmup is performed on-demand first time its executed
 }
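The demo above polls `human.models.stats().percentageLoaded` until loading completes. A timing-free sketch of that polling loop, with the `setTimeout` scheduling elided and `stats` as a mock standing in for the real `human.models.stats()`:

```javascript
// mock stand-in for human.models.stats(): progress is the fraction of models loaded
let loadedCount = 0;
const totalModels = 4;
const stats = () => ({ percentageLoaded: loadedCount / totalModels });

// poll until fully loaded, recording the progress seen at each tick
const ticks = [];
while (stats().percentageLoaded < 1) {
  ticks.push(stats().percentageLoaded);
  loadedCount += 1; // simulate one model finishing between polls
}

console.log(ticks, stats().percentageLoaded); // progress steps up from 0 and ends at 1
```

In the real demo each iteration is a `setTimeout(monitor, 10)` callback, so polling interleaves with the `human.load()` promise instead of running in a tight loop.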

View File

@@ -138,7 +138,7 @@ async function testDetect(human, input, title, checkLeak = true) {
   lastOp = `testDetect ${title}`;
   log('state', 'start', title);
   await human.load(config);
-  const missing = human.check();
+  const missing = human.models.validate();
   for (const op of missing) log('warn', 'missing kernel ops', { title, model: op.name, url: op.url, missing: op.missing, backkend: human.tf.getBackend() });
   const tensors = human.tf.engine().state.numTensors;
   const image = input ? await getImage(human, input) : human.tf.randomNormal([1, 1024, 1024, 3]);
@@ -189,7 +189,7 @@ async function verifyDetails(human) {
   verify(res.face.length === 1, 'details face length', res.face.length);
   for (const face of res.face) {
     verify(face.score > 0.9 && face.boxScore > 0.9 && face.faceScore > 0.9, 'details face score', face.score, face.boxScore, face.faceScore);
-    verify(face.age > 23 && face.age < 30 && face.gender === 'female' && face.genderScore > 0.9 && face.iris > 0.5 && face.distance < 2.5, 'details face age/gender', face.age, face.gender, face.genderScore, face.distance);
+    verify(face.age > 23 && face.age < 30 && face.gender === 'female' && face.genderScore > 0.9 && face.distance > 0.5 && face.distance < 2.5, 'details face age/gender', face.age, face.gender, face.genderScore, face.distance);
     verify(face.box.length === 4 && face.boxRaw.length === 4 && face.mesh.length === 478 && face.meshRaw.length === 478 && face.embedding.length === 1024, 'details face arrays', face.box.length, face.mesh.length, face.embedding.length);
     verify(face.emotion.length >= 2 && face.emotion[0].score > 0.30 && face.emotion[0].emotion === 'angry', 'details face emotion', face.emotion.length, face.emotion[0]);
     verify(face.real > 0.55, 'details face anti-spoofing', face.real);
@@ -293,9 +293,9 @@ async function test(Human, inputConfig) {
   // test model loading
   log('info', 'test: model load');
   await human.load();
-  const models = Object.keys(human.models).map((model) => ({ name: model, loaded: (human.models[model] !== null), url: human.models[model] ? human.models[model]['modelUrl'] : null }));
-  const loaded = models.filter((model) => model.loaded);
-  if (models.length === 25 && loaded.length === 11) log('state', 'passed: models loaded', models.length, loaded.length, models);
+  const models = human.models.list();
+  const loaded = human.models.loaded();
+  if (models.length === 24 && loaded.length === 11) log('state', 'passed: models loaded', models.length, loaded.length, models);
   else log('error', 'failed: models loaded', models.length, loaded.length, models);
   log('info', 'memory:', { memory: human.tf.memory() });
   log('info', 'state:', { state: human.tf.engine().state });
@@ -380,15 +380,15 @@ async function test(Human, inputConfig) {
   const desc3 = res3 && res3.face && res3.face[0] && res3.face[0].embedding ? [...res3.face[0].embedding] : null;
   if (!desc1 || !desc2 || !desc3 || desc1.length !== 1024 || desc2.length !== 1024 || desc3.length !== 1024) log('error', 'failed: face descriptor', desc1?.length, desc2?.length, desc3?.length);
   else log('state', 'passed: face descriptor');
-  res1 = human.similarity(desc1, desc1);
-  res2 = human.similarity(desc1, desc2);
-  res3 = human.similarity(desc1, desc3);
+  res1 = human.match.similarity(desc1, desc1);
+  res2 = human.match.similarity(desc1, desc2);
+  res3 = human.match.similarity(desc1, desc3);
   if (res1 < 1 || res2 < 0.40 || res3 < 0.40 || res2 > 0.75 || res3 > 0.75) log('error', 'failed: face similarity', { similarity: [res1, res2, res3], descriptors: [desc1?.length, desc2?.length, desc3?.length] });
   else log('state', 'passed: face similarity', { similarity: [res1, res2, res3], descriptors: [desc1?.length, desc2?.length, desc3?.length] });
   // test object detection
   log('info', 'test object');
-  config.object = { enabled: true, modelPath: 'mb3-centernet.json' };
+  config.object = { enabled: true, modelPath: 'centernet.json' };
   res = await testDetect(human, 'samples/in/ai-body.jpg', 'object');
   if (!res || res.object?.length < 1 || res.object[0]?.label !== 'person') log('error', 'failed: centernet', res.object);
   else log('state', 'passed: centernet');
@@ -461,9 +461,9 @@ async function test(Human, inputConfig) {
   const arr = db.map((rec) => rec.embedding);
   if (db.length < 20) log('error', 'failed: face database ', db.length);
   else log('state', 'passed: face database', db.length);
-  res1 = human.match(desc1, arr);
-  res2 = human.match(desc2, arr);
-  res3 = human.match(desc3, arr);
+  res1 = human.match.find(desc1, arr);
+  res2 = human.match.find(desc2, arr);
+  res3 = human.match.find(desc3, arr);
   if (res1.index !== 4 || res2.index !== 4 || res3.index !== 4) log('error', 'failed: face match', res1, res2, res3);
   else log('state', 'passed: face match', { first: { index: res1.index, similarity: res1.similarity } }, { second: { index: res2.index, similarity: res2.similarity } }, { third: { index: res3.index, similarity: res3.similarity } })
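The tests above exercise `human.match.similarity` and `human.match.find` on 1024-element face descriptors. As a rough illustration only, not the library's actual implementation (its distance metric may differ), descriptor matching can be sketched with cosine similarity and a linear scan over the database:

```javascript
// cosine similarity of two equal-length descriptors (illustrative metric only)
function similarity(a, b) {
  let dot = 0; let na = 0; let nb = 0;
  for (let i = 0; i < a.length; i++) { dot += a[i] * b[i]; na += a[i] * a[i]; nb += b[i] * b[i]; }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// return index and similarity of the closest descriptor in db
function find(desc, db) {
  let best = { index: -1, similarity: -Infinity };
  db.forEach((rec, index) => {
    const s = similarity(desc, rec);
    if (s > best.similarity) best = { index, similarity: s };
  });
  return best;
}

// tiny 3-element descriptors stand in for the real 1024-element embeddings
const d1 = [1, 0, 0];
const d2 = [0.9, 0.1, 0];
console.log(similarity(d1, d1)); // identical descriptors score 1
console.log(find(d1, [d2, [0, 1, 0], d1]).index); // exact copy wins
```

The real `find` scans the whole database the same way, which is why the test builds a 20+ entry `arr` and checks that all three query descriptors resolve to the same index.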

File diff suppressed because it is too large

wiki

@@ -1 +1 @@
-Subproject commit 93e58e16b579922e2f19bc91c8ead0af0f326f5a
+Subproject commit 6ea5ea911dcf7ad598c8ee3777b103d7e531fec5