From 4a6f9797e39666d07c8dcb2262c1545f1acb0510 Mon Sep 17 00:00:00 2001
From: Vladimir Mandic
Date: Fri, 18 Nov 2022 13:14:10 -0500
Subject: [PATCH] update wiki pages

---
 Backends.md           |   4 +-
 Build-Process.md      |   1 +
 Config.md             |   5 +-
 Development-Server.md |   1 +
 Diag.md               |  50 +++++++--------
 Draw.md               |   6 +-
 Embedding.md          |  25 ++++----
 Gesture.md            |  16 ++---
 Home.md               | 143 +++++++++++------------------------------
 Inputs.md             |   7 ++-
 Install.md            |  59 ++++++++++-------
 Issues.md             |   6 +-
 Module.md             |   6 +-
 Profiling.md          |   1 +
 Result.md             |   1 +
 Usage.md              |  54 +++++++++-------
 16 files changed, 170 insertions(+), 215 deletions(-)

diff --git a/Backends.md b/Backends.md
index facc6b3..ac33fbb 100644
--- a/Backends.md
+++ b/Backends.md
@@ -116,8 +116,8 @@ Cross-Origin-Embedder-Policy: require-corp

 Or configure `Human` to load WASM files directly from a CDN:

-```json
-wasmPath: 'https://cdn.jsdelivr.net/npm/@tensorflow/tfjs-backend-wasm@3.9.0/dist/'
+```js
+wasmPath = 'https://cdn.jsdelivr.net/npm/@tensorflow/tfjs-backend-wasm@3.9.0/dist/';
 ```

 Note that the version of the WASM binaries **must** match the version of TFJS used by the `Human` library

diff --git a/Build-Process.md b/Build-Process.md
index 495fd17..574bf95 100644
--- a/Build-Process.md
+++ b/Build-Process.md
@@ -40,6 +40,7 @@ Dev build runs following operations:

 Production build is started by running `npm run build`

+
 ```js
 2022-07-18 08:21:08 DATA: Build { name: '@vladmandic/human', version: '2.9.0' }
 2022-07-18 08:21:08 INFO: Analyze: { modelsDir: '../human-models/models', modelsOut: 'models/models.json' }

diff --git a/Config.md b/Config.md
index 4391147..39d4e64 100644
--- a/Config.md
+++ b/Config.md
@@ -10,6 +10,7 @@
 Overview of `Config` object type:

+
 ```ts
 interface Config {
   backend: string // backend engine to be used for processing
@@ -49,9 +50,9 @@ for example,

 ```js
 const myConfig = {
-  baseModelPath: `https://cdn.jsdelivr.net/npm/@vladmandic/human/models/`,
+  baseModelPath: 'https://cdn.jsdelivr.net/npm/@vladmandic/human/models/',
   segmentation: { enabled: true },
-}
+};
 const human = new Human(myConfig);
 const result = await human.detect(input);
 ```

diff --git a/Development-Server.md b/Development-Server.md
index 04c6b4c..c41bc1b 100644
--- a/Development-Server.md
+++ b/Development-Server.md
@@ -18,6 +18,7 @@ By default, secure http2 web server will run on port `10031` and unsecure http s

 Development environment is started by running `npm run dev`

+
 ```js
 2021-09-10 21:03:37 INFO: @vladmandic/human version 2.1.5
 2021-09-10 21:03:37 INFO: User: vlado Platform: linux Arch: x64 Node: v16.5.0

diff --git a/Diag.md b/Diag.md
index 2efbb0d..54e68fe 100644
--- a/Diag.md
+++ b/Diag.md
@@ -10,26 +10,25 @@ console.log(human.version);

 ## Enable console debug output

 ```js
-const human = new Human({ debug: true })
+const human = new Human({ debug: true });
 ```

 ## Get current configuration

 ```js
-console.log(human.config)
+console.log(human.config);
 ```
-```js
+```json
 {
-  backend: 'tensorflow',
-  modelBasePath: 'file://models/',
-  wasmPath: 'https://cdn.jsdelivr.net/npm/@tensorflow/tfjs-backend-wasm@3.9.0/dist/',
-  debug: true,
-  ...
+ "backend": "tensorflow", + "modelBasePath": "file://models/", + "wasmPath": "https://cdn.jsdelivr.net/npm/@tensorflow/tfjs-backend-wasm@3.9.0/dist/", + "debug": true } ``` ## Get current environment details ```js -console.log(human.env) +console.log(human.env); ``` ```json { @@ -43,17 +42,17 @@ console.log(human.env) "wasm": {"supported":true,"simd":true,"multithread":true}, "webgl": {"supported":true,"version":"WebGL 2.0 (OpenGL ES 3.0 Chromium)","renderer":"WebKit WebGL"}, "webgpu": {"supported":true,"adapter":"Default"}, - "kernels": [...] + "kernels": [] } ``` ## Get list of all models ```js - const models = Object.keys(human.models).map((model) => ({ name: model, loaded: (human.models[model] !== null) })); - console.log(models); +const models = Object.keys(human.models).map((model) => ({ name: model, loaded: (human.models[model] !== null) })); +console.log(models); ``` ```js -[ +models = [ { name: 'face', loaded: true }, { name: 'posenet', loaded: false }, { name: 'blazepose', loaded: false }, @@ -68,41 +67,36 @@ console.log(human.env) { name: 'centernet', loaded: false }, { name: 'faceres', loaded: true }, { name: 'segmentation', loaded: false }, -] +]; ``` ## Get memory usage information ```js - console.log(human.tf.engine().memory())); +console.log(human.tf.engine().memory()); ``` ```js -{ - numTensors: 1053, numDataBuffers: 1053, numBytes: 42736024 -} +memory = { numTensors: 1053, numDataBuffers: 1053, numBytes: 42736024 }; ``` ## Get current TensorFlow flags ```js - console.log(human.tf.ENV.flags); +console.log(human.tf.ENV.flags); ``` ```js -{ - DEBUG: false, PROD: true, CPU_HANDOFF_SIZE_THRESHOLD: 128 -} +flags = { DEBUG: false, PROD: true, CPU_HANDOFF_SIZE_THRESHOLD: 128 }; ``` ## Get performance information ```js - const result = await human.detect(input); - console.log(result.performance); +const result = await human.detect(input); +console.log(result.performance); ``` ```js - { - backend: 1, load: 283, image: 1, frames: 1, cached: 0, changed: 1, total: 947, draw: 0, - face: 390, emotion: 15, embedding: 97, body: 97, hand: 142, object: 312, gesture: 0 - } +performance = { + backend: 1, load: 283, image: 1, frames: 1, cached: 0, changed: 1, total: 947, draw: 0, face: 390, emotion: 15, embedding: 97, body: 97, hand: 142, object: 312, gesture: 0, +}; ``` ## All possible fatal errors diff --git a/Draw.md b/Draw.md index faef0a1..41accea 100644 --- a/Draw.md +++ b/Draw.md @@ -27,6 +27,7 @@ If `options.drawLabels` is enabled (default) ## Default Label Templates ```js +drawOptions = { faceLabels: `face confidence: [score]% [gender] [genderScore]% @@ -43,15 +44,14 @@ If `options.drawLabels` is enabled (default) handLabels: '[label] [score]%', fingerLabels: '[label]', gestureLabels: '[where] [who]: [what]', +}; ``` ## Example Example of custom labels: ```js -import * as Human from '@vladmandic/human'; -... 
-const drawOptions: Partial = {
+const drawOptions = {
   bodyLabels: `person confidence is [score]% and has ${human.result?.body?.[0]?.keypoints.length || 'no'} keypoints`,
 };
 human.draw.all(dom.canvas, human.result, drawOptions);

diff --git a/Embedding.md b/Embedding.md
index 994ffdb..46850fb 100644
--- a/Embedding.md
+++ b/Embedding.md
@@ -81,6 +81,7 @@ Similarity function is based on general *Minkowski distance* between all points
 Changing `order` can make similarity matching more or less sensitive (default order is 2nd order)

 For example, these will produce slightly different results:

+
 ```js
 const similarity2ndOrder = human.match.similarity(firstEmbedding, secondEmbedding, { order: 2 });
 const similarity3rdOrder = human.match.similarity(firstEmbedding, secondEmbedding, { order: 3 });
 ```

@@ -96,9 +97,9 @@
 to be used at a later time to find the best match for any given face

 For example:

 ```js
-  const db = [];
-  const res = await human.detect(input);
-  db.push({ label: 'this-is-me', embedding: res.face[0].embedding });
+const db = [];
+const res = await human.detect(input);
+db.push({ label: 'this-is-me', embedding: res.face[0].embedding });
 ```

 Note that you can have multiple entries for the same person and the best match will be used

@@ -106,10 +107,10 @@
 To find the best match, use the `match.find` method, providing the embedding descriptor to compare and a pre-prepared list of descriptors

 ```js
-  const embeddingArray = db.map((record) => record.embedding); // build array with just embeddings
-  const best = human.match.find(embedding, embeddingArray); // return is object: { index: number, similarity: number, distance: number }
-  const label = embeddingArray[best.index].label;
-  console.log({ name, similarity: best.similarity });
+const embeddingArray = db.map((record) => record.embedding); // build array with just embeddings
+const best = human.match.find(embedding, embeddingArray); // return is object: { index: number, similarity: number, distance: number }
+const label = db[best.index].label; // look up label in original db since embeddingArray holds embeddings only
+console.log({ label, similarity: best.similarity });
 ```

 Database can be further stored in a JS or JSON file and retrieved when needed
 For example, see `/demo/facematch/facematch.js` and example database `/demo/facematch/faces.json`:

 > download db with known faces using http/https

 ```js
-  let res = await fetch('/demo/facematch/faces.json');
-  db = (res && res.ok) ? await res.json() : [];
+const res = await fetch('/demo/facematch/faces.json');
+db = (res && res.ok) ? await res.json() : [];
 ```

 > download db with known faces from a local file

 ```js
-  const fs = require('fs');
-  const buffer = fs.readFileSync('/demo/facematch/faces.json');
-  db = JSON.parse(buffer);
+const fs = require('fs');
+const buffer = fs.readFileSync('/demo/facematch/faces.json');
+db = JSON.parse(buffer);
 ```
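Taken together, the steps above form a complete enroll-and-match flow. A minimal sketch, using only the calls shown in this page; the `input1`/`input2` image sources and the `'person-a'` label are illustrative placeholders, not part of the demo:

```js
// enroll: detect a known face and store its embedding descriptor with a label
const db = [];
const known = await human.detect(input1); // input1: any supported image source (illustrative)
db.push({ label: 'person-a', embedding: known.face[0].embedding });

// match: detect a new face and find the closest stored descriptor
const probe = await human.detect(input2); // input2: image to identify (illustrative)
const embeddingArray = db.map((record) => record.embedding); // embeddings only
const best = human.match.find(probe.face[0].embedding, embeddingArray);
console.log({ label: db[best.index].label, similarity: best.similarity });
```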
diff --git a/Gesture.md b/Gesture.md
index 9d23ebb..3bb78bf 100644
--- a/Gesture.md
+++ b/Gesture.md
@@ -26,14 +26,14 @@ There are three pre-defined methods:

 Example output of `result.gesture`:

 ```js
-[
-  {face: "0", gesture: "facing camera"}
-  {face: "0", gesture: "head up"}
-  {iris: "0", gesture: "looking center"}
-  {body: "0", gesture: "i give up"}
-  {body: "0", gesture: "leaning left"}
-  {hand: "0", gesture: "thumb forward middlefinger up"}
-]
+gesture = [
+  { face: '0', gesture: 'facing camera' },
+  { face: '0', gesture: 'head up' },
+  { iris: '0', gesture: 'looking center' },
+  { body: '0', gesture: 'i give up' },
+  { body: '0', gesture: 'leaning left' },
+  { hand: '0', gesture: 'thumb forward middlefinger up' },
+];
 ```

 The number after each gesture refers to the person that the detection belongs to in scenes with multiple people.

diff --git a/Home.md b/Home.md
index 7158fb1..aef63ef 100644
--- a/Home.md
+++ b/Home.md
@@ -25,6 +25,8 @@
 - Simple unified API
 - Built-in Image, Video and WebCam handling

+[*Jump to Quick Start*](#quick-start)
+
## Compatibility @@ -138,76 +140,6 @@ *Suggestions are welcome!* -

- -## App Examples - -Visit [Examples gallery](https://vladmandic.github.io/human/samples/index.html) for more examples - - -![samples](assets/samples.jpg) - -
- -## Options - -All options as presented in the demo application... -> [demo/index.html](demo/index.html) - -![Options visible in demo](assets/screenshot-menu.png) - -
- -**Results Browser:** -[ *Demo -> Display -> Show Results* ]
-![Results](assets/screenshot-results.png) - -
- -## Advanced Examples - -1. **Face Similarity Matching:** -Extracts all faces from provided input images, -sorts them by similarity to selected face -and optionally matches detected face with database of known people to guess their names -> [demo/facematch](demo/facematch/index.html) - -![Face Matching](assets/screenshot-facematch.jpg) - -2. **Face ID:** -Performs validation check on a webcam input to detect a real face and matches it to known faces stored in database -> [demo/faceid](demo/faceid/index.html) - -![Face Matching](assets/screenshot-faceid.jpg) - -
- -3. **3D Rendering:** -> [human-motion](https://github.com/vladmandic/human-motion) - -![Face3D](https://github.com/vladmandic/human-motion/raw/main/assets/screenshot-face.jpg) -![Body3D](https://github.com/vladmandic/human-motion/raw/main/assets/screenshot-body.jpg) -![Hand3D](https://github.com/vladmandic/human-motion/raw/main/assets/screenshot-hand.jpg) - -
- -4. **VR Model Tracking:** -> [human-three-vrm](https://github.com/vladmandic/human-three-vrm) -> [human-bjs-vrm](https://github.com/vladmandic/human-bjs-vrm) - -![ThreeVRM](https://github.com/vladmandic/human-three-vrm/raw/main/assets/human-vrm-screenshot.jpg) - - -5. **Human as OS native application:** -> [human-electron](https://github.com/vladmandic/human-electron) - -
- -**468-Point Face Mesh Defails:** -(view in full resolution to see keypoints) - -![FaceMesh](assets/facemesh.png) -


## Quick Start

Simply load `Human` (*IIFE version*) directly from a cloud CDN in your HTML file:
(pick one: `jsdelivr`, `unpkg` or `cdnjs`)

```html
<script src="https://cdn.jsdelivr.net/npm/@vladmandic/human/dist/human.js"></script>
```

For details, including how to use `Browser ESM` version or `NodeJS` version of `Human`, see [**Installation**](https://github.com/vladmandic/human/wiki/Install)
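Once loaded, the *IIFE* script exposes a global `Human` namespace. A minimal usage sketch, assuming the page contains an `<img id="image">` element (the element id is illustrative):

```js
// minimal sketch: create a Human instance with default configuration
const human = new Human.Human(); // IIFE build exposes the global `Human` namespace

async function run() {
  const image = document.getElementById('image'); // assumed: an existing <img> element
  const result = await human.detect(image); // run all enabled detection modules
  console.log(result.face, result.body, result.gesture); // inspect detection results
}

run();
```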
-## Inputs
-
-`Human` library can process all known input types:
-
-- `Image`, `ImageData`, `ImageBitmap`, `Canvas`, `OffscreenCanvas`, `Tensor`,
-- `HTMLImageElement`, `HTMLCanvasElement`, `HTMLVideoElement`, `HTMLMediaElement`
-
-Additionally, `HTMLVideoElement`, `HTMLMediaElement` can be a standard `<video>` tag