Compare commits

..

80 Commits

Author SHA1 Message Date
Vladimir Mandic 189226d63a full rebuild
Signed-off-by: Vladimir Mandic <mandic00@live.com>
2025-02-05 09:15:34 -05:00
Vladimir Mandic f587b44f66 1.7.15 2025-02-05 09:02:09 -05:00
Vladimir Mandic e3f11b8533 update build platform
Signed-off-by: Vladimir Mandic <mandic00@live.com>
2025-02-05 09:02:06 -05:00
Vladimir Mandic 171d17cadf update changelog 2024-09-10 11:31:01 -04:00
Vladimir Mandic e4cdf624c9 update build environment and full rebuild 2024-09-10 11:30:23 -04:00
Vladimir Mandic c633f9fbe4 1.7.14 2024-09-10 11:17:44 -04:00
Vladimir Mandic ffc3c40362 rebuild 2024-01-20 15:46:59 -05:00
Vladimir Mandic a8193f9077
Merge pull request #188 from rebser/master
fixing leaking EventHandlers when using HTMLCanvasElement
2024-01-20 15:45:04 -05:00
rebser 155f07dccd
fixing leaking EventHandlers when using HTMLCanvasElement 2024-01-19 08:38:59 +01:00
Vladimir Mandic 2f0469fe6e update readme 2024-01-17 17:04:22 -05:00
Vladimir Mandic 697b265337 rebuild types 2024-01-17 17:01:20 -05:00
Vladimir Mandic 4719b81587 rebuild 2024-01-17 16:56:53 -05:00
Vladimir Mandic fc9a39ea13 1.7.13 2024-01-17 16:44:28 -05:00
Vladimir Mandic 438897c5a2 update all dependencies 2024-01-17 16:44:24 -05:00
Vladimir Mandic f4d4780267
Merge pull request #186 from khwalkowicz/master
feat: enable noImplicitAny
2024-01-17 16:06:03 -05:00
Kamil H. Walkowicz a5c767fdff feat: enable noImplicitAny 2024-01-16 18:09:52 +01:00
Vladimir Mandic 1fa29b0fd3 update tfjs and rebuild 2023-06-12 12:02:21 -04:00
Vladimir Mandic 472f2e4480 1.7.12 2023-06-12 12:01:45 -04:00
Vladimir Mandic 4433ce44bc update dependencies 2023-05-08 09:08:30 -04:00
Vladimir Mandic 4ca829f941 1.7.11 2023-05-08 09:08:05 -04:00
Vladimir Mandic 038349968c update tfjs 2023-03-21 08:00:18 -04:00
Vladimir Mandic ae96c7b230 1.7.10 2023-03-21 07:59:27 -04:00
Vladimir Mandic f9f036ba01 change typedefs 2023-01-29 10:08:46 -05:00
Vladimir Mandic 0736a99250 1.7.9 2023-01-29 09:00:29 -05:00
Vladimir Mandic 3ea729badb update dependencies 2023-01-21 09:06:35 -05:00
Vladimir Mandic d36ed6d266 update changelog 2023-01-06 13:25:52 -05:00
Vladimir Mandic 4061d4d62f update tfjs 2023-01-06 13:24:17 -05:00
Vladimir Mandic b034c46f80 1.7.8 2023-01-06 13:04:31 -05:00
Vladimir Mandic aefd776a9e update dependencies 2022-12-21 14:14:22 -05:00
Vladimir Mandic 20eb54beb4 update 2022-12-04 14:14:05 -05:00
Vladimir Mandic e8301c5277 update 2022-12-04 13:23:41 -05:00
Vladimir Mandic fba823ba50 update tfjs 2022-12-01 14:56:40 -05:00
Vladimir Mandic a1cb6de1e8 1.7.7 2022-12-01 14:55:47 -05:00
Vladimir Mandic fb3836019f update dependencies 2022-11-12 11:54:00 -05:00
Vladimir Mandic 15ae496f40 update release 2022-10-18 07:23:49 -04:00
Vladimir Mandic 0009d1bc34 1.7.6 2022-10-18 07:23:04 -04:00
Vladimir Mandic adc4b3a11d update dependencies 2022-10-18 07:10:40 -04:00
Sohaib Ahmed 7e5a1289ff
Fix face angles (yaw, pitch, & roll) accuracy (#130)
Previously derived aforementioned angles seemed inaccurate and somewhat unusable (given their output was in radians). This update uses a person's mesh positions and chooses specific points for accurate results. It also adds directionality of the movements (e.g. pitching the head backwards is a negative result, as is rolling the head to the left).

The webcam.js file has also been updated to showcase the correct output in degrees (reducing potential user confusion)

Committer: Sohaib Ahmed <sohaibi.ahmed@icloud.com>

Co-authored-by: Sophia Glisch <sophiaglisch@Sophias-MacBook-Pro.local>
2022-10-18 07:09:35 -04:00
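The commit above derives head angles from face-mesh landmark positions and reports them in degrees with directionality. As an illustration only (the landmark points and coordinates below are hypothetical, not face-api's actual mesh indices), the core idea can be sketched as:

```javascript
// Hypothetical sketch: estimate head roll from two eye landmarks.
// Roll is the angle of the line connecting the eyes relative to horizontal,
// converted from radians to degrees so the output is directly readable.
const radToDeg = (rad) => (rad * 180) / Math.PI;

function estimateRoll(leftEye, rightEye) {
  const dy = rightEye.y - leftEye.y; // positive dy = right eye lower = head rolled
  const dx = rightEye.x - leftEye.x;
  return radToDeg(Math.atan2(dy, dx));
}

// level eyes produce a roll of 0 degrees
console.log(estimateRoll({ x: 100, y: 50 }, { x: 200, y: 50 }));
```

The sign of the result encodes direction, matching the commit's note that rolling the head to one side yields a negative value.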
Vladimir Mandic cd2c553737 update tfjs 2022-10-14 08:01:39 -04:00
Vladimir Mandic a433fc0681 1.7.5 2022-10-09 13:42:45 -04:00
Vladimir Mandic f9902b0459 update readme 2022-10-09 13:42:38 -04:00
Vladimir Mandic bd5ab6bb0f update 2022-10-09 13:41:11 -04:00
Vladimir Mandic 96fed4f123 update tfjs 2022-10-09 13:40:33 -04:00
Vladimir Mandic 0cbfd9b01b update dependencies 2022-09-29 10:38:14 -04:00
Vladimir Mandic dea225bbeb
Create FUNDING.yml 2022-09-26 09:39:08 -04:00
Vladimir Mandic 602e86cbec add node-wasm demo 2022-09-25 16:40:42 -04:00
Vladimir Mandic 00bf49b24f 1.7.4 2022-09-25 16:39:22 -04:00
Vladimir Mandic fa33c1281c improve face compare performance 2022-09-14 08:18:51 -04:00
Vladimir Mandic 7f613367a3 update tfjs and typescript 2022-09-04 15:18:07 -04:00
Vladimir Mandic 4d65f459f9 update tfjs 2022-08-24 08:21:15 -04:00
Vladimir Mandic d28e5d2142 1.7.3 2022-08-24 08:20:11 -04:00
Vladimir Mandic 6aeb292453 refresh release 2022-08-23 08:26:07 -04:00
Vladimir Mandic 289faf17f2 1.7.2 2022-08-23 08:25:42 -04:00
Vladimir Mandic 7a6f7d96b7 document and remove optional dependencies 2022-08-23 08:21:20 -04:00
Vladimir Mandic 870eebedfa update dependencies 2022-08-22 13:17:39 -04:00
Vladimir Mandic 1ed702f713 update readme 2022-08-16 20:25:26 -04:00
Nina Egger b2a988e436
update readme 2022-08-03 15:14:56 -04:00
Vladimir Mandic 5c38676a83 update build platform 2022-07-29 09:24:51 -04:00
Vladimir Mandic bac0ef10cf update readme 2022-07-26 07:27:52 -04:00
Vladimir Mandic 8baef0ef68 update links 2022-07-25 08:38:52 -04:00
Vladimir Mandic c5dbb9d4e9 release build 2022-07-25 08:23:57 -04:00
Vladimir Mandic a8021dc2a3 1.7.1 2022-07-25 08:21:02 -04:00
Vladimir Mandic f946780bab refactor dependencies 2022-07-25 08:20:59 -04:00
Vladimir Mandic 8e7061a9aa full rebuild 2022-05-24 07:18:59 -04:00
Vladimir Mandic cd904ca5dd 1.6.11 2022-05-24 07:18:51 -04:00
Vladimir Mandic 496779fee2 1.6.10 2022-05-24 07:17:40 -04:00
Vladimir Mandic 4ba4a99ee1 update tfjs 2022-05-24 07:16:42 -04:00
Vladimir Mandic 31170e750b update changelog 2022-05-18 08:36:24 -04:00
Vladimir Mandic 5f58cd376d update tfjs 2022-05-18 08:36:05 -04:00
Vladimir Mandic 07eb00d7d6 1.6.9 2022-05-18 08:21:59 -04:00
Vladimir Mandic a1f7a0841f update libraries 2022-05-09 08:12:24 -04:00
Vladimir Mandic 49a594a59b 1.6.8 2022-05-09 08:11:31 -04:00
Vladimir Mandic 3b3ab219dc update dependencies 2022-04-09 09:48:06 -04:00
Vladimir Mandic 2fce7338dc exclude impossible detected face boxes 2022-04-05 07:38:11 -04:00
Vladimir Mandic 6cafeafba1 update tfjs 2022-04-01 09:16:17 -04:00
Vladimir Mandic d0f1349a23 1.6.7 2022-04-01 09:15:45 -04:00
abdemirza cdb0e485f8
fixed typo error (#97)
Co-authored-by: Abuzar Mirza <abdermiza@gmail.com>
2022-03-10 06:48:14 -05:00
Vladimir Mandic 5bcc4d2a73 update changelog 2022-03-07 13:17:54 -05:00
Vladimir Mandic 92008ed6f4 update tfjs and ts 2022-03-07 13:17:31 -05:00
Vladimir Mandic c1b38f99fe 1.6.6 2022-03-04 16:48:47 -05:00
413 changed files with 15547 additions and 152444 deletions


@@ -6,7 +6,7 @@
     "output": "build.log"
   },
   "profiles": {
-    "production": ["clean", "compile", "typings", "typedoc", "lint", "changelog"],
+    "production": ["compile", "typings", "typedoc", "lint", "changelog"],
     "development": ["serve", "watch", "compile"]
   },
   "clean": {
@@ -37,6 +37,13 @@
     "banner": { "js": "/*\n  Face-API\n  homepage: <https://github.com/vladmandic/face-api>\n  author: <https://github.com/vladmandic>'\n*/\n" }
   },
   "targets": [
+    {
+      "name": "tfjs/browser/tf-version",
+      "platform": "browser",
+      "format": "esm",
+      "input": "src/tfjs/tf-version.ts",
+      "output": "dist/tfjs.version.js"
+    },
     {
       "name": "tfjs/node/cpu",
       "platform": "node",
@@ -85,13 +92,6 @@
       "output": "dist/face-api.node-wasm.js",
       "external": ["@tensorflow"]
     },
-    {
-      "name": "tfjs/browser/tf-version",
-      "platform": "browser",
-      "format": "esm",
-      "input": "src/tfjs/tf-version.ts",
-      "output": "dist/tfjs.version.js"
-    },
     {
       "name": "tfjs/browser/esm/nobundle",
       "platform": "browser",


@@ -7,7 +7,7 @@
     "es2020": true
   },
   "parser": "@typescript-eslint/parser",
-  "parserOptions": { "ecmaVersion": 2020 },
+  "parserOptions": { "ecmaVersion": "latest" },
   "plugins": [
     "@typescript-eslint"
   ],
@@ -17,7 +17,6 @@
     "plugin:import/warnings",
     "plugin:node/recommended",
     "plugin:promise/recommended",
-    "plugin:json/recommended-with-comments",
     "plugin:@typescript-eslint/eslint-recommended",
     "plugin:@typescript-eslint/recommended",
     "airbnb-base"
@@ -29,6 +28,8 @@
     "@typescript-eslint/ban-ts-comment": "off",
     "@typescript-eslint/explicit-module-boundary-types": "off",
     "@typescript-eslint/no-var-requires": "off",
+    "@typescript-eslint/no-empty-object-type": "off",
+    "@typescript-eslint/no-require-imports": "off",
     "camelcase": "off",
     "class-methods-use-this": "off",
     "default-param-last": "off",

.github/FUNDING.yml (new file)

@ -0,0 +1,13 @@
# These are supported funding model platforms
github: [vladmandic]
patreon: # Replace with a single Patreon username
open_collective: # Replace with a single Open Collective username
ko_fi: # Replace with a single Ko-fi username
tidelift: # Replace with a single Tidelift platform-name/package-name e.g., npm/babel
community_bridge: # Replace with a single Community Bridge project-name e.g., cloud-foundry
liberapay: # Replace with a single Liberapay username
issuehunt: # Replace with a single IssueHunt username
otechie: # Replace with a single Otechie username
lfx_crowdfunding: # Replace with a single LFX Crowdfunding project-name e.g., cloud-foundry
custom: # Replace with up to 4 custom sponsorship URLs e.g., ['link1', 'link2']

.npmrc

@@ -1 +1,5 @@
-force = true
+force=true
+production=true
+legacy-peer-deps=true
+strict-peer-dependencies=false
+node-options='--no-deprecation'

.vscode/settings.json (new file)

@ -0,0 +1,3 @@
{
"typescript.tsdk": "node_modules/typescript/lib"
}


@@ -1,20 +1,107 @@
-#
+# @vladmandic/face-api
 
-Version: **undefined**
-Description: **undefined**
-Author: **undefined**
-License: **undefined**
+Version: **1.7.15**
+Description: **FaceAPI: AI-powered Face Detection & Rotation Tracking, Face Description & Recognition, Age & Gender & Emotion Prediction for Browser and NodeJS using TensorFlow/JS**
+Author: **Vladimir Mandic <mandic00@live.com>**
+License: **MIT**
 Repository: **<https://github.com/vladmandic/face-api>**
 
 ## Changelog
 
+### **1.7.15** 2025/02/05 mandic00@live.com
+### **origin/master** 2024/09/10 mandic00@live.com
+### **1.7.14** 2024/09/10 mandic00@live.com
+- rebuild
+- merge pull request #188 from rebser/master
+- fixing leaking eventhandlers when using htmlcanvaselement
+- rebuild types
+- rebuild
+### **1.7.13** 2024/01/17 mandic00@live.com
+- merge pull request #186 from khwalkowicz/master
+- feat: enable noimplicitany
+### **release: 1.7.12** 2023/06/12 mandic00@live.com
+### **1.7.12** 2023/06/12 mandic00@live.com
+### **1.7.11** 2023/05/08 mandic00@live.com
+### **1.7.10** 2023/03/21 mandic00@live.com
+- change typedefs
+### **1.7.9** 2023/01/29 mandic00@live.com
+### **1.7.8** 2023/01/06 mandic00@live.com
+### **1.7.7** 2022/12/01 mandic00@live.com
+### **1.7.6** 2022/10/18 mandic00@live.com
+- fix face angles (yaw, pitch, & roll) accuracy (#130)
+### **1.7.5** 2022/10/09 mandic00@live.com
+- create funding.yml
+- add node-wasm demo
+### **1.7.4** 2022/09/25 mandic00@live.com
+- improve face compare performance
+### **1.7.3** 2022/08/24 mandic00@live.com
+- refresh release
+### **1.7.2** 2022/08/23 mandic00@live.com
+- document and remove optional dependencies
+### **release: 1.7.1** 2022/07/25 mandic00@live.com
+### **1.7.1** 2022/07/25 mandic00@live.com
+- refactor dependencies
+- full rebuild
+### **1.6.11** 2022/05/24 mandic00@live.com
+### **1.6.10** 2022/05/24 mandic00@live.com
+### **1.6.9** 2022/05/18 mandic00@live.com
+### **1.6.8** 2022/05/09 mandic00@live.com
+- exclude impossible detected face boxes
+### **1.6.7** 2022/04/01 mandic00@live.com
+- fixed typo error (#97)
+### **1.6.6** 2022/03/04 mandic00@live.com
 ### **1.6.5** 2022/02/07 mandic00@live.com
+### **origin/master** 2022/01/14 mandic00@live.com
 ### **1.6.4** 2022/01/14 mandic00@live.com
 - add node with wasm build target

README.md

@@ -55,7 +55,7 @@ Example can be accessed directly using Git pages using URL:
 NodeJS examples are:
-- `/demp/node-simple.js`:
+- `/demo/node-simple.js`:
   Simplest possible NodeJS demo for FaceAPI in under 30 lines of JavaScript code
 - `/demo/node.js`:
   Using `TFJS` native methods to load images without external dependencies
@@ -104,8 +104,11 @@ NodeJS examples are:
 2021-03-14 08:42:09 STATE: Main: worker exit: 1888019 0
 ```
-Note that `@tensorflow/tfjs-node` or `@tensorflow/tfjs-node-gpu`
-must be installed before using any **NodeJS** examples
+### NodeJS Notes
+- Supported NodeJS versions are **14** up to **22**
+  NodeJS version **23** and higher are not supported due to incompatibility with TensorFlow/JS
+- `@tensorflow/tfjs-node` or `@tensorflow/tfjs-node-gpu`
+  must be installed before using any **NodeJS** examples
 
 <br><hr><br>
@@ -258,7 +261,7 @@ If you want to GPU Accelerated execution in NodeJS, you must have CUDA libraries
 Then install appropriate version of `FaceAPI`:
 ```shell
-npm install @tensorflow/tfjs-node
+npm install @tensorflow/tfjs-node-gpu
 npm install @vladmandic/face-api
 ```
@@ -269,18 +272,24 @@ And then use with:
 const faceapi = require('@vladmandic/face-api/dist/face-api.node-gpu.js'); // this loads face-api version with correct bindings for tfjs-node-gpu
 ```
 
-If you want to use `FaceAPI` in a NodeJS on platforms where NodeJS binary libraries are not supported, you can use JavaScript CPU backend.
+If you want to use `FaceAPI` in a NodeJS on platforms where **tensorflow** binary libraries are not supported, you can use NodeJS **WASM** backend.
 
 ```shell
 npm install @tensorflow/tfjs
+npm install @tensorflow/tfjs-backend-wasm
 npm install @vladmandic/face-api
 ```
 
 And then use with:
 
 ```js
-const tf = require('@tensorflow/tfjs')
-const faceapi = require('@vladmandic/face-api/dist/face-api.node-cpu.js');
+const tf = require('@tensorflow/tfjs');
+const wasm = require('@tensorflow/tfjs-backend-wasm');
+const faceapi = require('@vladmandic/face-api/dist/face-api.node-wasm.js'); // use this when using face-api in dev mode
+wasm.setWasmPaths('https://cdn.jsdelivr.net/npm/@tensorflow/tfjs-backend-wasm/dist/');
+await tf.setBackend('wasm');
+await tf.ready();
+...
 ```
 
 If you want to use graphical functions inside NodeJS,
@@ -308,12 +317,14 @@ faceapi.env.monkeyPatch({ Canvas, Image, ImageData })
 ## Weights
 
-Pretrained models and their weights are includes in `./model`.
+Pretrained models and their weights are included in `./model`.
 
 <br><hr><br>
 
 ## Test & Dev Web Server
 
+To install development dependencies, use `npm install --production=false`
+
 Built-in test&dev web server can be started using
 ```shell
@@ -389,43 +400,41 @@ cd face-api
 Then install all dependencies and run rebuild:
 
 ```shell
-npm install
+npm install --production=false
 npm run build
 ```
 
 Build process uses `@vladmandic/build` module that creates optimized build for each target:
 
-```text
-> @vladmandic/face-api@1.0.2 build
-> rimraf dist/* types/* typedoc/* && node server/build.js
-```
-
 ```js
-2022-01-14 09:54:23 INFO: Application: { name: '@vladmandic/face-api', version: '1.6.4' }
-2022-01-14 09:54:23 INFO: Environment: { profile: 'production', config: '.build.json', package: 'package.json', tsconfig: true, eslintrc: true, git: true }
-2022-01-14 09:54:23 INFO: Toolchain: { build: '0.6.7', esbuild: '0.14.11', typescript: '4.5.4', typedoc: '0.22.10', eslint: '8.6.0' }
-2022-01-14 09:54:23 INFO: Build: { profile: 'production', steps: [ 'clean', 'compile', 'typings', 'typedoc', 'lint', 'changelog' ] }
-2022-01-14 09:54:23 STATE: Clean: { locations: [ 'dist/*', 'typedoc/*', 'types/lib/src' ] }
-2022-01-14 09:54:23 STATE: Compile: { name: 'tfjs/node/cpu', format: 'cjs', platform: 'node', input: 'src/tfjs/tf-node.ts', output: 'dist/tfjs.esm.js', files: 1, inputBytes: 143, outputBytes: 1276 }
-2022-01-14 09:54:23 STATE: Compile: { name: 'faceapi/node/cpu', format: 'cjs', platform: 'node', input: 'src/index.ts', output: 'dist/face-api.node.js', files: 162, inputBytes: 234787, outputBytes: 175203 }
-2022-01-14 09:54:23 STATE: Compile: { name: 'tfjs/node/gpu', format: 'cjs', platform: 'node', input: 'src/tfjs/tf-node-gpu.ts', output: 'dist/tfjs.esm.js', files: 1, inputBytes: 147, outputBytes: 1296 }
-2022-01-14 09:54:23 STATE: Compile: { name: 'faceapi/node/gpu', format: 'cjs', platform: 'node', input: 'src/index.ts', output: 'dist/face-api.node-gpu.js', files: 162, inputBytes: 234807, outputBytes: 175219 }
-2022-01-14 09:54:23 STATE: Compile: { name: 'tfjs/node/wasm', format: 'cjs', platform: 'node', input: 'src/tfjs/tf-node-wasm.ts', output: 'dist/tfjs.esm.js', files: 1, inputBytes: 185, outputBytes: 1367 }
-2022-01-14 09:54:23 STATE: Compile: { name: 'faceapi/node/wasm', format: 'cjs', platform: 'node', input: 'src/index.ts', output: 'dist/face-api.node-wasm.js', files: 162, inputBytes: 234878, outputBytes: 175294 }
-2022-01-14 09:54:23 STATE: Compile: { name: 'tfjs/browser/tf-version', format: 'esm', platform: 'browser', input: 'src/tfjs/tf-version.ts', output: 'dist/tfjs.version.js', files: 1, inputBytes: 1063, outputBytes: 1662 }
-2022-01-14 09:54:23 STATE: Compile: { name: 'tfjs/browser/esm/nobundle', format: 'esm', platform: 'browser', input: 'src/tfjs/tf-browser.ts', output: 'dist/tfjs.esm.js', files: 2, inputBytes: 2172, outputBytes: 811 }
-2022-01-14 09:54:23 STATE: Compile: { name: 'faceapi/browser/esm/nobundle', format: 'esm', platform: 'browser', input: 'src/index.ts', output: 'dist/face-api.esm-nobundle.js', files: 162, inputBytes: 234322, outputBytes: 169437 }
-2022-01-14 09:54:24 STATE: Compile: { name: 'tfjs/browser/esm/bundle', format: 'esm', platform: 'browser', input: 'src/tfjs/tf-browser.ts', output: 'dist/tfjs.esm.js', files: 11, inputBytes: 2172, outputBytes: 2444105 }
-2022-01-14 09:54:24 STATE: Compile: { name: 'faceapi/browser/iife/bundle', format: 'iife', platform: 'browser', input: 'src/index.ts', output: 'dist/face-api.js', files: 162, inputBytes: 2677616, outputBytes: 1252572 }
-2022-01-14 09:54:24 STATE: Compile: { name: 'faceapi/browser/esm/bundle', format: 'esm', platform: 'browser', input: 'src/index.ts', output: 'dist/face-api.esm.js', files: 162, inputBytes: 2677616, outputBytes: 2435063 }
-2022-01-14 09:54:27 STATE: Typings: { input: 'src/index.ts', output: 'types/lib', files: 93 }
-2022-01-14 09:54:31 STATE: TypeDoc: { input: 'src/index.ts', output: 'typedoc', objects: 154, generated: true }
-2022-01-14 09:54:45 STATE: Lint: { locations: [ 'src/' ], files: 174, errors: 0, warnings: 0 }
-2022-01-14 09:54:45 STATE: ChangeLog: { repository: 'https://github.com/vladmandic/face-api', branch: 'master', output: 'CHANGELOG.md' }
-2022-01-14 09:54:45 INFO: Done...
-2022-01-14 09:54:45 STATE: Copy: { input: 'types/lib/dist/tfjs.esm.d.ts' }
-2022-01-14 09:54:46 STATE: API-Extractor: { succeeeded: true, errors: 0, warnings: 414 }
-2022-01-14 09:54:46 INFO: FaceAPI Build complete...
+> @vladmandic/face-api@1.7.1 build /home/vlado/dev/face-api
+> node build.js
+
+2022-07-25 08:21:05 INFO: Application: { name: '@vladmandic/face-api', version: '1.7.1' }
+2022-07-25 08:21:05 INFO: Environment: { profile: 'production', config: '.build.json', package: 'package.json', tsconfig: true, eslintrc: true, git: true }
+2022-07-25 08:21:05 INFO: Toolchain: { build: '0.7.7', esbuild: '0.14.50', typescript: '4.7.4', typedoc: '0.23.9', eslint: '8.20.0' }
+2022-07-25 08:21:05 INFO: Build: { profile: 'production', steps: [ 'clean', 'compile', 'typings', 'typedoc', 'lint', 'changelog' ] }
+2022-07-25 08:21:05 STATE: Clean: { locations: [ 'dist/*', 'typedoc/*', 'types/lib/src' ] }
+2022-07-25 08:21:05 STATE: Compile: { name: 'tfjs/node/cpu', format: 'cjs', platform: 'node', input: 'src/tfjs/tf-node.ts', output: 'dist/tfjs.esm.js', files: 1, inputBytes: 143, outputBytes: 614 }
+2022-07-25 08:21:05 STATE: Compile: { name: 'faceapi/node/cpu', format: 'cjs', platform: 'node', input: 'src/index.ts', output: 'dist/face-api.node.js', files: 162, inputBytes: 234137, outputBytes: 85701 }
+2022-07-25 08:21:05 STATE: Compile: { name: 'tfjs/node/gpu', format: 'cjs', platform: 'node', input: 'src/tfjs/tf-node-gpu.ts', output: 'dist/tfjs.esm.js', files: 1, inputBytes: 147, outputBytes: 618 }
+2022-07-25 08:21:05 STATE: Compile: { name: 'faceapi/node/gpu', format: 'cjs', platform: 'node', input: 'src/index.ts', output: 'dist/face-api.node-gpu.js', files: 162, inputBytes: 234141, outputBytes: 85705 }
+2022-07-25 08:21:05 STATE: Compile: { name: 'tfjs/node/wasm', format: 'cjs', platform: 'node', input: 'src/tfjs/tf-node-wasm.ts', output: 'dist/tfjs.esm.js', files: 1, inputBytes: 185, outputBytes: 670 }
+2022-07-25 08:21:05 STATE: Compile: { name: 'faceapi/node/wasm', format: 'cjs', platform: 'node', input: 'src/index.ts', output: 'dist/face-api.node-wasm.js', files: 162, inputBytes: 234193, outputBytes: 85755 }
+2022-07-25 08:21:05 STATE: Compile: { name: 'tfjs/browser/tf-version', format: 'esm', platform: 'browser', input: 'src/tfjs/tf-version.ts', output: 'dist/tfjs.version.js', files: 1, inputBytes: 1063, outputBytes: 400 }
+2022-07-25 08:21:05 STATE: Compile: { name: 'tfjs/browser/esm/nobundle', format: 'esm', platform: 'browser', input: 'src/tfjs/tf-browser.ts', output: 'dist/tfjs.esm.js', files: 2, inputBytes: 910, outputBytes: 527 }
+2022-07-25 08:21:05 STATE: Compile: { name: 'faceapi/browser/esm/nobundle', format: 'esm', platform: 'browser', input: 'src/index.ts', output: 'dist/face-api.esm-nobundle.js', files: 162, inputBytes: 234050, outputBytes: 82787 }
+2022-07-25 08:21:05 STATE: Compile: { name: 'tfjs/browser/esm/bundle', format: 'esm', platform: 'browser', input: 'src/tfjs/tf-browser.ts', output: 'dist/tfjs.esm.js', files: 11, inputBytes: 910, outputBytes: 1184871 }
+2022-07-25 08:21:05 STATE: Compile: { name: 'faceapi/browser/iife/bundle', format: 'iife', platform: 'browser', input: 'src/index.ts', output: 'dist/face-api.js', files: 162, inputBytes: 1418394, outputBytes: 1264631 }
+2022-07-25 08:21:05 STATE: Compile: { name: 'faceapi/browser/esm/bundle', format: 'esm', platform: 'browser', input: 'src/index.ts', output: 'dist/face-api.esm.js', files: 162, inputBytes: 1418394, outputBytes: 1264150 }
+2022-07-25 08:21:07 STATE: Typings: { input: 'src/index.ts', output: 'types/lib', files: 93 }
+2022-07-25 08:21:09 STATE: TypeDoc: { input: 'src/index.ts', output: 'typedoc', objects: 154, generated: true }
+2022-07-25 08:21:13 STATE: Lint: { locations: [ 'src/' ], files: 174, errors: 0, warnings: 0 }
+2022-07-25 08:21:14 STATE: ChangeLog: { repository: 'https://github.com/vladmandic/face-api', branch: 'master', output: 'CHANGELOG.md' }
+2022-07-25 08:21:14 INFO: Done...
+2022-07-25 08:21:14 STATE: Copy: { input: 'types/lib/dist/tfjs.esm.d.ts' }
+2022-07-25 08:21:15 STATE: API-Extractor: { succeeeded: true, errors: 0, warnings: 417 }
+2022-07-25 08:21:15 INFO: FaceAPI Build complete...
 ```
 <br><hr><br>
@@ -440,18 +449,14 @@ Build process uses `@vladmandic/build` module that creates optimized build for e
 ## Note
 
-This is updated **face-api.js** with latest available TensorFlow/JS as the original is not compatible with **tfjs 2.0+**.
+This is updated **face-api.js** with latest available TensorFlow/JS as the original is not compatible with **tfjs >=2.0**.
 Forked from [face-api.js](https://github.com/justadudewhohacks/face-api.js) version **0.22.2** which was released on March 22nd, 2020
 
-Currently using **`TensorFlow/JS` 3.13.0**
-
-*Why?* I needed FaceAPI that does not cause version conflict with newer versions of TensorFlow
-And since original FaceAPI was open-source, I've released this version as well
-Changes ended up being too large for a simple pull request
-and it ended up being a full-fledged version on its own
-Plus many features were added since original inception
+*Why?* I needed a FaceAPI that does not cause version conflict with newer versions of TensorFlow
+And since the original FaceAPI was open-source, I've released this version as well
+Changes ended up being too large for a simple pull request and it ended up being a full-fledged version on its own
+Plus many features were added since the original inception
 
 Although a lot of work has gone into this version of `FaceAPI` and it will continue to be maintained,
 at this time it is completely superseded by my newer library `Human` which covers the same use cases,
@@ -466,11 +471,12 @@ but extends it with newer AI models, additional detection details, compatibility
 Compared to [face-api.js](https://github.com/justadudewhohacks/face-api.js) version **0.22.2**:
 
-- Compatible with `TensorFlow/JS 2.0+ & 3.0+`
+- Compatible with `TensorFlow/JS 2.0+, 3.0+ and 4.0+`
+  Currently using **`TensorFlow/JS` 4.16**
   Original `face-api.js` is based on `TFJS` **1.7.4**
 - Compatible with `WebGL`, `CPU` and `WASM` TFJS Browser backends
 - Compatible with both `tfjs-node` and `tfjs-node-gpu` TFJS NodeJS backends
-- Updated all type castings for TypeScript type checking to `TypeScript 4.5`
+- Updated all type castings for TypeScript type checking to `TypeScript 5.3`
 - Switched bundling from `UMD` to `ESM` + `CommonJS` with fallback to `IIFE`
   Resulting code is optimized per-platform instead of being universal
   Fully tree shakable when imported as an `ESM` module
@@ -492,7 +498,7 @@ Compared to [face-api.js](https://github.com/justadudewhohacks/face-api.js) vers
 - Added `face angle` calculations that returns `roll`, `yaw` and `pitch`
 - Added `typdoc` automatic API specification generation during build
 - Added `changelog` automatic generation during build
-- Created new process to generate **TypeDocs** bundle using API-Extractor
+- New process to generate **TypeDocs** bundle using API-Extractor
 
 <br>


@@ -3,13 +3,44 @@ const log = require('@vladmandic/pilogger');
 const Build = require('@vladmandic/build').Build;
 const APIExtractor = require('@microsoft/api-extractor');
 
-// eslint-disable-next-line no-unused-vars, @typescript-eslint/no-unused-vars
-function copy(src, dst) {
-  if (!fs.existsSync(src)) return;
+const regEx = [
+  { search: 'types="@webgpu/types/dist"', replace: 'path="../src/types/webgpu.d.ts"' },
+  { search: 'types="offscreencanvas"', replace: 'path="../src/types/offscreencanvas.d.ts"' },
+];
+
+function copyFile(src, dst) {
+  if (!fs.existsSync(src)) {
+    log.warn('Copy:', { input: src, output: dst });
+    return;
+  }
+  log.state('Copy:', { input: src, output: dst });
   const buffer = fs.readFileSync(src);
   fs.writeFileSync(dst, buffer);
 }
 
+function writeFile(str, dst) {
+  log.state('Write:', { output: dst });
+  fs.writeFileSync(dst, str);
+}
+
+function regExFile(src, entries) {
+  if (!fs.existsSync(src)) {
+    log.warn('Filter:', { src });
+    return;
+  }
+  log.state('Filter:', { input: src });
+  for (const entry of entries) {
+    const buffer = fs.readFileSync(src, 'UTF-8');
+    const lines = buffer.split(/\r?\n/);
+    const out = [];
+    for (const line of lines) {
+      if (line.includes(entry.search)) out.push(line.replace(entry.search, entry.replace));
+      else out.push(line);
+    }
+    fs.writeFileSync(src, out.join('\n'));
+  }
+}
+
 const apiIgnoreList = ['ae-forgotten-export', 'ae-unresolved-link', 'tsdoc-param-tag-missing-hyphen'];
 
 async function main() {
@@ -18,7 +49,7 @@ async function main() {
   await build.run('production');
   // patch tfjs typedefs
   log.state('Copy:', { input: 'types/lib/dist/tfjs.esm.d.ts' });
-  copy('types/lib/dist/tfjs.esm.d.ts', 'dist/tfjs.esm.d.ts');
+  copyFile('types/lib/dist/tfjs.esm.d.ts', 'dist/tfjs.esm.d.ts');
   // run api-extractor to create typedef rollup
   const extractorConfig = APIExtractor.ExtractorConfig.loadFileAndPrepare('api-extractor.json');
   const extractorResult = APIExtractor.Extractor.invoke(extractorConfig, {
@@ -33,16 +64,13 @@
     },
   });
   log.state('API-Extractor:', { succeeeded: extractorResult.succeeded, errors: extractorResult.errorCount, warnings: extractorResult.warningCount });
-  // distribute typedefs
-  /*
-  log.state('Copy:', { input: 'types/human.d.ts' });
-  copy('types/human.d.ts', 'dist/human.esm-nobundle.d.ts');
-  copy('types/human.d.ts', 'dist/human.esm.d.ts');
-  copy('types/human.d.ts', 'dist/human.d.ts');
-  copy('types/human.d.ts', 'dist/human.node-gpu.d.ts');
-  copy('types/human.d.ts', 'dist/human.node.d.ts');
-  copy('types/human.d.ts', 'dist/human.node-wasm.d.ts');
-  */
+  regExFile('types/face-api.d.ts', regEx);
+  writeFile('export * from \'../types/face-api\';', 'dist/face-api.esm-nobundle.d.ts');
+  writeFile('export * from \'../types/face-api\';', 'dist/face-api.esm.d.ts');
+  writeFile('export * from \'../types/face-api\';', 'dist/face-api.d.ts');
+  writeFile('export * from \'../types/face-api\';', 'dist/face-api.node.d.ts');
+  writeFile('export * from \'../types/face-api\';', 'dist/face-api.node-gpu.d.ts');
+  writeFile('export * from \'../types/face-api\';', 'dist/face-api.node-wasm.d.ts');
   log.info('FaceAPI Build complete...');
 }


@@ -11,7 +11,7 @@
  <link rel="shortcut icon" href="../favicon.ico" type="image/x-icon">
  <script src="./index.js" type="module"></script>
</head>
-<body style="font-family: monospace; background: black; color: white; font-size: 16px; line-height: 22px; margin: 0;">
+<body style="font-family: monospace; background: black; color: white; font-size: 16px; line-height: 22px; margin: 0; overflow-x: hidden;">
  <div id="log"></div>
</body>
</html>


@@ -1,8 +1,14 @@
-import * as faceapi from '../dist/face-api.esm.js';
+/**
+ * FaceAPI Demo for Browsers
+ * Loaded via `index.html`
+ */
+import * as faceapi from '../dist/face-api.esm.js'; // use when in dev mode
+// import * as faceapi from '@vladmandic/face-api'; // use when downloading face-api as npm
// configuration options
const modelPath = '../model/'; // path to model folder that will be loaded using http
-// const modelPath = 'https://vladmandic.github.io/face-api/model/'; // path to model folder that will be loaded using http
+// const modelPath = 'https://cdn.jsdelivr.net/npm/@vladmandic/face-api/model/'; // path to model folder that will be loaded using http
const imgSize = 800; // maximum image size in pixels
const minScore = 0.3; // minimum score
const maxResults = 10; // maximum number of results to return
@@ -13,8 +19,7 @@ const str = (json) => (json ? JSON.stringify(json).replace(/{|}|"|\[|\]/g, '').r
// helper function to print strings to html document as a log
function log(...txt) {
-  // eslint-disable-next-line no-console
-  console.log(...txt);
+  console.log(...txt); // eslint-disable-line no-console
  const div = document.getElementById('log');
  if (div) div.innerHTML += `<br>${txt}`;
}
@@ -28,11 +33,9 @@ function faces(name, title, id, data) {
  canvas.style.position = 'absolute';
  canvas.style.left = `${img.offsetLeft}px`;
  canvas.style.top = `${img.offsetTop}px`;
-  // @ts-ignore
  canvas.width = img.width;
-  // @ts-ignore
  canvas.height = img.height;
-  const ctx = canvas.getContext('2d');
+  const ctx = canvas.getContext('2d', { willReadFrequently: true });
  if (!ctx) return;
  // draw title
  ctx.font = '1rem sans-serif';
@@ -48,6 +51,7 @@ function faces(name, title, id, data) {
  ctx.beginPath();
  ctx.rect(person.detection.box.x, person.detection.box.y, person.detection.box.width, person.detection.box.height);
  ctx.stroke();
+  // draw text labels
  ctx.globalAlpha = 1;
  ctx.fillText(`${Math.round(100 * person.genderProbability)}% ${person.gender}`, person.detection.box.x, person.detection.box.y - 18);
  ctx.fillText(`${Math.round(person.age)} years`, person.detection.box.x, person.detection.box.y - 2);
@@ -67,8 +71,7 @@ function faces(name, title, id, data) {
// helper function to draw processed image and its results
function print(title, img, data) {
-  // eslint-disable-next-line no-console
-  console.log('Results:', title, img, data);
+  console.log('Results:', title, img, data); // eslint-disable-line no-console
  const el = new Image();
  el.id = Math.floor(Math.random() * 100000).toString();
  el.src = img;
@@ -91,7 +94,7 @@ async function image(url) {
  const canvas = document.createElement('canvas');
  canvas.height = img.height;
  canvas.width = img.width;
-  const ctx = canvas.getContext('2d');
+  const ctx = canvas.getContext('2d', { willReadFrequently: true });
  if (ctx) ctx.drawImage(img, 0, 0, img.width, img.height);
  // return generated canvas to be used by tfjs during detection
  resolve(canvas);
@@ -105,19 +108,20 @@ async function main() {
  // initialize tfjs
  log('FaceAPI Test');
-  const params = new URLSearchParams(location.search);
-  if (params.has('backend')) {
-    const backend = params.get('backend');
-    await faceapi.tf.setWasmPaths('https://cdn.jsdelivr.net/npm/@tensorflow/tfjs-backend-wasm@3.13.0/dist/');
-    log(`Chosen backend: ${backend}`);
-    await faceapi.tf.setBackend(backend);
-  } else {
-    // default is webgl backend
-    await faceapi.tf.setBackend('webgl');
-  }
+  // if you want to use wasm backend location for wasm binaries must be specified
+  // await faceapi.tf?.setWasmPaths(`https://cdn.jsdelivr.net/npm/@tensorflow/tfjs-backend-wasm@${faceapi.tf.version_core}/dist/`);
+  // await faceapi.tf?.setBackend('wasm');
+  // log(`WASM SIMD: ${await faceapi.tf?.env().getAsync('WASM_HAS_SIMD_SUPPORT')} Threads: ${await faceapi.tf?.env().getAsync('WASM_HAS_MULTITHREAD_SUPPORT') ? 'Multi' : 'Single'}`);
+  // default is webgl backend
+  await faceapi.tf.setBackend('webgl');
+  await faceapi.tf.ready();
+  // tfjs optimizations
+  if (faceapi.tf?.env().flagRegistry.CANVAS2D_WILL_READ_FREQUENTLY) faceapi.tf.env().set('CANVAS2D_WILL_READ_FREQUENTLY', true);
+  if (faceapi.tf?.env().flagRegistry.WEBGL_EXP_CONV) faceapi.tf.env().set('WEBGL_EXP_CONV', true);
+  if (faceapi.tf?.env().flagRegistry.WEBGL_EXP_CONV) faceapi.tf.env().set('WEBGL_EXP_CONV', true);
  await faceapi.tf.enableProdMode();
-  await faceapi.tf.ENV.set('DEBUG', false);
  await faceapi.tf.ready();
  // check version
@@ -139,16 +143,9 @@ async function main() {
  const engine = await faceapi.tf.engine();
  log(`TF Engine State: ${str(engine.state)}`);
-  // const testT = faceapi.tf.tensor([0]);
-  // const testF = testT.toFloat();
-  // console.log(testT.print(), testF.print());
-  // testT.dispose();
-  // testF.dispose();
  // loop through all images and try to process them
  log(`Start processing: ${samples.length} images ...<br>`);
  for (const img of samples) {
-    // new line
    document.body.appendChild(document.createElement('br'));
    // load and resize image
    const canvas = await image(img);
@@ -174,8 +171,6 @@ async function main() {
      print('SSDMobileNet:', img, dataSSDMobileNet);
    } catch (err) {
      log(`Image: ${img} Error during processing ${str(err)}`);
-      // eslint-disable-next-line no-console
-      console.error(err);
    }
  }
}


@@ -1,13 +1,20 @@
+/**
+ * FaceAPI Demo for NodeJS
+ * - Uses external library [canvas](https://www.npmjs.com/package/canvas) to decode image
+ * - Loads image from provided param
+ * - Outputs results to console
+ */
+// canvas library provides full canvas (load/draw/write) functionality for nodejs
+// must be installed manually as it just a demo dependency and not actual face-api dependency
+const canvas = require('canvas'); // eslint-disable-line node/no-missing-require
const fs = require('fs');
const path = require('path');
const process = require('process');
const log = require('@vladmandic/pilogger');
-const canvas = require('canvas');
-// eslint-disable-next-line import/no-extraneous-dependencies, no-unused-vars, @typescript-eslint/no-unused-vars
const tf = require('@tensorflow/tfjs-node'); // in nodejs environments tfjs-node is required to be loaded before face-api
+// const faceapi = require('@vladmandic/face-api'); // use this when face-api is installed as module (majority of use cases)
const faceapi = require('../dist/face-api.node.js'); // use this when using face-api in dev mode
-// const faceapi = require('@vladmandic/face-api'); // use this when face-api is installed as module (majority of use cases)
const modelPathRoot = '../model';
const imgPathRoot = './demo'; // modify to include your sample images
@@ -50,11 +57,9 @@ async function main() {
  faceapi.env.monkeyPatch({ Canvas: canvas.Canvas, Image: canvas.Image, ImageData: canvas.ImageData });
  await faceapi.tf.setBackend('tensorflow');
-  await faceapi.tf.enableProdMode();
-  await faceapi.tf.ENV.set('DEBUG', false);
  await faceapi.tf.ready();
-  log.state(`Version: TensorFlow/JS ${faceapi.tf?.version_core} FaceAPI ${faceapi.version} Backend: ${faceapi.tf?.getBackend()}`);
+  log.state(`Version: FaceAPI ${faceapi.version} TensorFlow/JS ${tf.version_core} Backend: ${faceapi.tf?.getBackend()}`);
  log.info('Loading FaceAPI models');
  const modelPath = path.join(__dirname, modelPathRoot);

demo/node-face-compare.js (new file)

@@ -0,0 +1,35 @@
/**
 * FaceAPI demo that loads two images and computes the similarity between the most prominent face in each image
*/
const fs = require('fs');
const tf = require('@tensorflow/tfjs-node');
const faceapi = require('../dist/face-api.node');
let optionsSSDMobileNet;
const getDescriptors = async (imageFile) => {
const buffer = fs.readFileSync(imageFile);
const tensor = tf.node.decodeImage(buffer, 3);
const faces = await faceapi.detectAllFaces(tensor, optionsSSDMobileNet)
.withFaceLandmarks()
.withFaceDescriptors();
tf.dispose(tensor);
return faces.map((face) => face.descriptor);
};
const main = async (file1, file2) => {
console.log('input images:', file1, file2); // eslint-disable-line no-console
await tf.ready();
await faceapi.nets.ssdMobilenetv1.loadFromDisk('model');
optionsSSDMobileNet = new faceapi.SsdMobilenetv1Options({ minConfidence: 0.5, maxResults: 1 });
await faceapi.nets.faceLandmark68Net.loadFromDisk('model');
await faceapi.nets.faceRecognitionNet.loadFromDisk('model');
const desc1 = await getDescriptors(file1);
const desc2 = await getDescriptors(file2);
const distance = faceapi.euclideanDistance(desc1[0], desc2[0]); // only compare first found face in each image
console.log('distance between most prominent detected faces:', distance); // eslint-disable-line no-console
console.log('similarity between most prominent detected faces:', 1 - distance); // eslint-disable-line no-console
};
main('demo/sample1.jpg', 'demo/sample2.jpg');
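The `faceapi.euclideanDistance` call above reduces to plain L2 distance between two equal-length descriptor vectors, and the demo's "similarity" is just `1 - distance`. A minimal sketch of that arithmetic in plain JavaScript (the 4-element vectors are toy values; real FaceAPI descriptors are `Float32Array`s of length 128):

```javascript
// L2 (euclidean) distance between two descriptor vectors of equal length
function euclideanDistance(a, b) {
  let sum = 0;
  for (let i = 0; i < a.length; i++) sum += (a[i] - b[i]) ** 2; // accumulate squared differences
  return Math.sqrt(sum);
}

// toy "descriptors" for illustration only
const desc1 = [0.1, 0.2, 0.3, 0.4];
const desc2 = [0.1, 0.2, 0.3, 0.9];
const distance = euclideanDistance(desc1, desc2);
console.log('distance:', distance, 'similarity:', 1 - distance); // distance: 0.5 similarity: 0.5
```

Lower distance means the faces are more alike; well-matched faces typically land well below the common 0.6 threshold.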


@@ -1,12 +1,18 @@
-const fs = require('fs');
-// eslint-disable-next-line import/no-extraneous-dependencies, node/no-unpublished-require
-const image = require('@canvas/image'); // @canvas/image can decode jpeg, png, webp
-const log = require('@vladmandic/pilogger');
-// eslint-disable-next-line import/no-extraneous-dependencies, no-unused-vars, @typescript-eslint/no-unused-vars
+/**
+ * FaceAPI Demo for NodeJS
+ * - Uses external library [@canvas/image](https://www.npmjs.com/package/@canvas/image) to decode image
+ * - Loads image from provided param
+ * - Outputs results to console
+ */
+// @canvas/image can decode jpeg, png, webp
+// must be installed manually as it just a demo dependency and not actual face-api dependency
+const image = require('@canvas/image'); // eslint-disable-line node/no-missing-require
+const fs = require('fs');
+const log = require('@vladmandic/pilogger');
const tf = require('@tensorflow/tfjs-node'); // in nodejs environments tfjs-node is required to be loaded before face-api
+// const faceapi = require('@vladmandic/face-api'); // use this when face-api is installed as module (majority of use cases)
const faceapi = require('../dist/face-api.node.js'); // use this when using face-api in dev mode
-// const faceapi = require('@vladmandic/face-api'); // use this when face-api is installed as module (majority of use cases)
const modelPath = 'model/';
const imageFile = 'demo/sample1.jpg';


@@ -1,11 +1,16 @@
+/**
+ * FaceAPI Demo for NodeJS
+ * - Analyzes face descriptors from source (image file or folder containing multiple image files)
+ * - Analyzes face descriptor from target
+ * - Finds best match
+ */
const fs = require('fs');
const path = require('path');
const log = require('@vladmandic/pilogger');
-// eslint-disable-next-line import/no-extraneous-dependencies, no-unused-vars, @typescript-eslint/no-unused-vars
const tf = require('@tensorflow/tfjs-node'); // in nodejs environments tfjs-node is required to be loaded before face-api
+// const faceapi = require('@vladmandic/face-api'); // use this when face-api is installed as module (majority of use cases)
const faceapi = require('../dist/face-api.node.js'); // use this when using face-api in dev mode
-// const faceapi = require('@vladmandic/face-api'); // use this when face-api is installed as module (majority of use cases)
let optionsSSDMobileNet;
const minConfidence = 0.1;
@@ -33,6 +38,8 @@ async function getDescriptors(imageFile) {
}
async function registerImage(inputFile) {
+  if (!inputFile.toLowerCase().endsWith('jpg') && !inputFile.toLowerCase().endsWith('png') && !inputFile.toLowerCase().endsWith('gif')) return;
+  log.data('Registered:', inputFile);
  const descriptors = await getDescriptors(inputFile);
  for (const descriptor of descriptors) {
    const labeledFaceDescriptor = new faceapi.LabeledFaceDescriptors(inputFile, [descriptor]);
@@ -60,14 +67,18 @@ async function main() {
  await initFaceAPI();
  log.info('Input:', process.argv[2]);
  if (fs.statSync(process.argv[2]).isFile()) {
-    await registerImage(process.argv[2]);
+    await registerImage(process.argv[2]); // register image
  } else if (fs.statSync(process.argv[2]).isDirectory()) {
    const dir = fs.readdirSync(process.argv[2]);
-    for (const f of dir) await registerImage(path.join(process.argv[2], f));
+    for (const f of dir) await registerImage(path.join(process.argv[2], f)); // register all images in a folder
  }
-  log.info('Descriptors:', labeledFaceDescriptors.length);
-  const bestMatch = await findBestMatch(process.argv[3]);
-  log.data('Match:', bestMatch);
+  log.info('Comparing:', process.argv[3], 'Descriptors:', labeledFaceDescriptors.length);
+  if (labeledFaceDescriptors.length > 0) {
+    const bestMatch = await findBestMatch(process.argv[3]); // find best match to all registered images
+    log.data('Match:', bestMatch);
+  } else {
+    log.warn('No registered faces');
+  }
}
main();
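`findBestMatch` in this demo delegates to FaceAPI's matcher, but conceptually it just scans all registered labeled descriptors and keeps the label with the smallest euclidean distance to the query. A self-contained sketch of that idea with toy 2-element descriptors (the labels and values are hypothetical, not the actual FaceAPI API):

```javascript
// pick the registered label whose descriptor is closest (L2) to the query descriptor
function findBestMatch(labeled, query) {
  const dist = (a, b) => Math.sqrt(a.reduce((s, v, i) => s + (v - b[i]) ** 2, 0));
  let best = { label: 'unknown', distance: Infinity };
  for (const { label, descriptor } of labeled) {
    const d = dist(descriptor, query);
    if (d < best.distance) best = { label, distance: d }; // keep the closest match so far
  }
  return best;
}

const registered = [
  { label: 'person-a.jpg', descriptor: [0.1, 0.9] },
  { label: 'person-b.jpg', descriptor: [0.8, 0.2] },
];
console.log(findBestMatch(registered, [0.75, 0.25])); // closest to person-b.jpg
```

In practice a distance threshold is also applied so that queries far from every registered face fall back to an "unknown" label.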


@@ -1,14 +1,16 @@
-// @ts-nocheck
+/**
+ * FaceAPI Demo for NodeJS
+ * - Used by `node-multiprocess.js`
+ */
const fs = require('fs');
const path = require('path');
const log = require('@vladmandic/pilogger');
// workers actual import tfjs and faceapi modules
-// eslint-disable-next-line import/no-extraneous-dependencies, no-unused-vars, @typescript-eslint/no-unused-vars
const tf = require('@tensorflow/tfjs-node'); // in nodejs environments tfjs-node is required to be loaded before face-api
+// const faceapi = require('@vladmandic/face-api'); // use this when face-api is installed as module (majority of use cases)
const faceapi = require('../dist/face-api.node.js'); // use this when using face-api in dev mode
-// const faceapi = require('@vladmandic/face-api'); // use this when face-api is installed as module (majority of use cases)
// options used by faceapi
const modelPathRoot = '../model';


@@ -1,3 +1,9 @@
+/**
+ * FaceAPI Demo for NodeJS
+ * - Starts multiple worker processes and uses them as worker pool to process all input images
+ * - Images are enumerated in main process and sent for processing to worker processes via ipc
+ */
const fs = require('fs');
const path = require('path');
const log = require('@vladmandic/pilogger'); // this is my simple logger with few extra features
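The worker-pool pattern described in the header comment reduces to "whenever a worker signals over IPC that it is free, hand it the next queued image". A stripped-down sketch of just that dispatch logic, with a plain in-memory queue instead of real `child_process` forks (the `workerId` parameter and return shape are hypothetical illustrations, not this demo's actual message format):

```javascript
// simple work-queue dispatcher: each worker pulls the next image when it becomes free
function createDispatcher(images) {
  const queue = [...images]; // copy so the caller's array is untouched
  return {
    next(workerId) { // called when a worker reports it is ready for more work
      const img = queue.shift();
      return img === undefined ? { workerId, done: true } : { workerId, img };
    },
  };
}

const dispatcher = createDispatcher(['sample1.jpg', 'sample2.jpg', 'sample3.jpg']);
console.log(dispatcher.next(0)); // { workerId: 0, img: 'sample1.jpg' }
console.log(dispatcher.next(1)); // { workerId: 1, img: 'sample2.jpg' }
```

Pull-based dispatch like this keeps all workers busy even when images take very different amounts of time to process, which is why it suits face detection workloads.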


@@ -1,7 +1,13 @@
+/**
+ * FaceAPI Demo for NodeJS
+ * - Loads image
+ * - Outputs results to console
+ */
const fs = require('fs');
+// const faceapi = require('@vladmandic/face-api'); // use this when face-api is installed as module (majority of use cases)
const faceapi = require('../dist/face-api.node.js'); // use this when using face-api in dev mode
-// const faceapi = require('@vladmandic/face-api'); // use this when face-api is installed as module (majority of use cases)
async function main() {
  await faceapi.nets.ssdMobilenetv1.loadFromDisk('model'); // load models from a specific path
@@ -19,8 +25,7 @@ async function main() {
    .withFaceDescriptors()
    .withAgeAndGender();
  faceapi.tf.dispose([decodeT, expandT]); // dispose tensors to avoid memory leaks
-  // eslint-disable-next-line no-console
-  console.log({ result }); // print results
+  console.log({ result }); // eslint-disable-line no-console
}
main();

demo/node-wasm.js (new file)

@@ -0,0 +1,53 @@
/**
* FaceAPI Demo for NodeJS using WASM
* - Loads WASM binaries from external CDN
* - Loads image
* - Outputs results to console
*/
const fs = require('fs');
const image = require('@canvas/image'); // eslint-disable-line node/no-missing-require
const tf = require('@tensorflow/tfjs');
const wasm = require('@tensorflow/tfjs-backend-wasm');
const faceapi = require('../dist/face-api.node-wasm.js'); // use this when using face-api in dev mode
async function readImage(imageFile) {
const buffer = fs.readFileSync(imageFile); // read image from disk
const canvas = await image.imageFromBuffer(buffer); // decode to canvas
const imageData = image.getImageData(canvas); // read decoded image data from canvas
const tensor = tf.tidy(() => { // create tensor from image data
const data = tf.tensor(Array.from(imageData?.data || []), [canvas.height, canvas.width, 4], 'int32'); // create rgba image tensor from flat array and flip to height x width
const channels = tf.split(data, 4, 2); // split rgba to channels
const rgb = tf.stack([channels[0], channels[1], channels[2]], 2); // stack channels back to rgb
const squeeze = tf.squeeze(rgb); // remove the extra size-1 dimension left over after stacking channels
return squeeze;
});
console.log(`Image: ${imageFile} [${canvas.width} x ${canvas.height}] Tensor: ${tensor.shape}, Size: ${tensor.size}`); // eslint-disable-line no-console
return tensor;
}
async function main() {
wasm.setWasmPaths('https://cdn.jsdelivr.net/npm/@tensorflow/tfjs-backend-wasm/dist/', true);
await tf.setBackend('wasm');
await tf.ready();
console.log(`Version: FaceAPI ${faceapi.version} TensorFlow/JS ${tf.version_core} Backend: ${faceapi.tf.getBackend()}`); // eslint-disable-line no-console
await faceapi.nets.ssdMobilenetv1.loadFromDisk('model'); // load models from a specific path
await faceapi.nets.faceLandmark68Net.loadFromDisk('model');
await faceapi.nets.ageGenderNet.loadFromDisk('model');
await faceapi.nets.faceRecognitionNet.loadFromDisk('model');
await faceapi.nets.faceExpressionNet.loadFromDisk('model');
const options = new faceapi.SsdMobilenetv1Options({ minConfidence: 0.1, maxResults: 10 }); // set model options
const tensor = await readImage('demo/sample1.jpg');
const t0 = performance.now();
const result = await faceapi.detectAllFaces(tensor, options) // run detection
.withFaceLandmarks()
.withFaceExpressions()
.withFaceDescriptors()
.withAgeAndGender();
tf.dispose(tensor); // dispose tensors to avoid memory leaks
const t1 = performance.now();
console.log('Time', t1 - t0); // eslint-disable-line no-console
console.log('Result', result); // eslint-disable-line no-console
}
main();
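The `tf.split`/`tf.stack` step in `readImage` above exists only to drop the alpha channel from the RGBA canvas data before detection. The same channel-dropping can be illustrated on a flat pixel array without tfjs (a toy 2-pixel image; this is a sketch of the idea, not the tensor code itself):

```javascript
// drop the alpha channel from flat RGBA data, mirroring the split/stack step in readImage
function rgbaToRgb(data) {
  const rgb = [];
  for (let i = 0; i < data.length; i += 4) {
    rgb.push(data[i], data[i + 1], data[i + 2]); // keep r, g, b; skip data[i + 3] (alpha)
  }
  return rgb;
}

const rgba = [255, 0, 0, 255, /* red pixel */ 0, 0, 255, 255 /* blue pixel */];
console.log(rgbaToRgb(rgba)); // [ 255, 0, 0, 0, 0, 255 ]
```

Doing this with tensor ops instead of a JS loop keeps the data on the backend and avoids a per-pixel round trip, which matters for full-size images.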


@@ -1,12 +1,18 @@
+/**
+ * FaceAPI Demo for NodeJS
+ * - Uses external library [node-fetch](https://www.npmjs.com/package/node-fetch) to load images via http
+ * - Loads image from provided param
+ * - Outputs results to console
+ */
const fs = require('fs');
const process = require('process');
const path = require('path');
const log = require('@vladmandic/pilogger');
-// eslint-disable-next-line import/no-extraneous-dependencies, no-unused-vars, @typescript-eslint/no-unused-vars
const tf = require('@tensorflow/tfjs-node'); // in nodejs environments tfjs-node is required to be loaded before face-api
+// const faceapi = require('@vladmandic/face-api'); // use this when face-api is installed as module (majority of use cases)
const faceapi = require('../dist/face-api.node.js'); // use this when using face-api in dev mode
-// const faceapi = require('@vladmandic/face-api'); // use this when face-api is installed as module (majority of use cases)
const modelPathRoot = '../model';
const imgPathRoot = './demo'; // modify to include your sample images
@@ -87,11 +93,10 @@ async function main() {
  log.header();
  log.info('FaceAPI single-process test');
-  fetch = (await import('node-fetch')).default;
+  // eslint-disable-next-line node/no-extraneous-import
+  fetch = (await import('node-fetch')).default; // eslint-disable-line node/no-missing-import
  await faceapi.tf.setBackend('tensorflow');
-  await faceapi.tf.enableProdMode();
-  await faceapi.tf.ENV.set('DEBUG', false);
  await faceapi.tf.ready();
  log.state(`Version: TensorFlow/JS ${faceapi.tf?.version_core} FaceAPI ${faceapi.version} Backend: ${faceapi.tf?.getBackend()}`);


@@ -1,8 +1,14 @@
-import * as faceapi from '../dist/face-api.esm.js';
+/**
+ * FaceAPI Demo for Browsers
+ * Loaded via `webcam.html`
+ */
+import * as faceapi from '../dist/face-api.esm.js'; // use when in dev mode
+// import * as faceapi from '@vladmandic/face-api'; // use when downloading face-api as npm
// configuration options
const modelPath = '../model/'; // path to model folder that will be loaded using http
-// const modelPath = 'https://vladmandic.github.io/face-api/model/'; // path to model folder that will be loaded using http
+// const modelPath = 'https://cdn.jsdelivr.net/npm/@vladmandic/face-api/model/'; // path to model folder that will be loaded using http
const minScore = 0.2; // minimum score
const maxResults = 5; // maximum number of results to return
let optionsSSDMobileNet;
@@ -17,15 +23,14 @@ function str(json) {
// helper function to print strings to html document as a log
function log(...txt) {
-  // eslint-disable-next-line no-console
-  console.log(...txt);
+  console.log(...txt); // eslint-disable-line no-console
  const div = document.getElementById('log');
  if (div) div.innerHTML += `<br>${txt}`;
}
// helper function to draw detected faces
function drawFaces(canvas, data, fps) {
-  const ctx = canvas.getContext('2d');
+  const ctx = canvas.getContext('2d', { willReadFrequently: true });
  if (!ctx) return;
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  // draw title
@@ -42,18 +47,18 @@ function drawFaces(canvas, data, fps) {
  ctx.rect(person.detection.box.x, person.detection.box.y, person.detection.box.width, person.detection.box.height);
  ctx.stroke();
  ctx.globalAlpha = 1;
-  // const expression = person.expressions.sort((a, b) => Object.values(a)[0] - Object.values(b)[0]);
+  // draw text labels
  const expression = Object.entries(person.expressions).sort((a, b) => b[1] - a[1]);
  ctx.fillStyle = 'black';
  ctx.fillText(`gender: ${Math.round(100 * person.genderProbability)}% ${person.gender}`, person.detection.box.x, person.detection.box.y - 59);
  ctx.fillText(`expression: ${Math.round(100 * expression[0][1])}% ${expression[0][0]}`, person.detection.box.x, person.detection.box.y - 41);
  ctx.fillText(`age: ${Math.round(person.age)} years`, person.detection.box.x, person.detection.box.y - 23);
-  ctx.fillText(`roll:${person.angle.roll.toFixed(3)} pitch:${person.angle.pitch.toFixed(3)} yaw:${person.angle.yaw.toFixed(3)}`, person.detection.box.x, person.detection.box.y - 5);
+  ctx.fillText(`roll:${person.angle.roll}° pitch:${person.angle.pitch}° yaw:${person.angle.yaw}°`, person.detection.box.x, person.detection.box.y - 5);
  ctx.fillStyle = 'lightblue';
  ctx.fillText(`gender: ${Math.round(100 * person.genderProbability)}% ${person.gender}`, person.detection.box.x, person.detection.box.y - 60);
  ctx.fillText(`expression: ${Math.round(100 * expression[0][1])}% ${expression[0][0]}`, person.detection.box.x, person.detection.box.y - 42);
  ctx.fillText(`age: ${Math.round(person.age)} years`, person.detection.box.x, person.detection.box.y - 24);
-  ctx.fillText(`roll:${person.angle.roll.toFixed(3)} pitch:${person.angle.pitch.toFixed(3)} yaw:${person.angle.yaw.toFixed(3)}`, person.detection.box.x, person.detection.box.y - 6);
+  ctx.fillText(`roll:${person.angle.roll}° pitch:${person.angle.pitch}° yaw:${person.angle.yaw}°`, person.detection.box.x, person.detection.box.y - 6);
  // draw face points for each face
  ctx.globalAlpha = 0.8;
  ctx.fillStyle = 'lightblue';
@@ -61,7 +66,6 @@ function drawFaces(canvas, data, fps) {
  for (let i = 0; i < person.landmarks.positions.length; i++) {
    ctx.beginPath();
    ctx.arc(person.landmarks.positions[i].x, person.landmarks.positions[i].y, pointSize, 0, 2 * Math.PI);
-    // ctx.fillText(`${i}`, person.landmarks.positions[i].x + 4, person.landmarks.positions[i].y + 4);
    ctx.fill();
  }
}
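The hunk above picks the dominant expression by sorting the `person.expressions` object entries by probability, descending. Isolated, that one line behaves like this (the probability values are toy numbers for illustration):

```javascript
// sort expression probabilities descending and take the top entry, as drawFaces does
const expressions = { neutral: 0.05, happy: 0.9, surprised: 0.05 };
const sorted = Object.entries(expressions).sort((a, b) => b[1] - a[1]);
const [name, prob] = sorted[0]; // highest-probability expression
console.log(`expression: ${Math.round(100 * prob)}% ${name}`); // expression: 90% happy
```

Sorting `Object.entries` pairs avoids mutating the original expressions object and keeps label and score together for the canvas text output.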
@@ -95,7 +99,6 @@ async function setupCamera() {
  const canvas = document.getElementById('canvas');
  if (!video || !canvas) return null;
-  let msg = '';
  log('Setting up camera');
  // setup webcam. note that navigator.mediaDevices requires that page is accessed via https
  if (!navigator.mediaDevices) {
@@ -103,23 +106,19 @@ async function setupCamera() {
    return null;
  }
  let stream;
-  const constraints = {
-    audio: false,
-    video: { facingMode: 'user', resizeMode: 'crop-and-scale' },
-  };
+  const constraints = { audio: false, video: { facingMode: 'user', resizeMode: 'crop-and-scale' } };
  if (window.innerWidth > window.innerHeight) constraints.video.width = { ideal: window.innerWidth };
  else constraints.video.height = { ideal: window.innerHeight };
  try {
    stream = await navigator.mediaDevices.getUserMedia(constraints);
  } catch (err) {
-    if (err.name === 'PermissionDeniedError' || err.name === 'NotAllowedError') msg = 'camera permission denied';
-    else if (err.name === 'SourceUnavailableError') msg = 'camera not available';
-    log(`Camera Error: ${msg}: ${err.message || err}`);
+    if (err.name === 'PermissionDeniedError' || err.name === 'NotAllowedError') log(`Camera Error: camera permission denied: ${err.message || err}`);
+    if (err.name === 'SourceUnavailableError') log(`Camera Error: camera not available: ${err.message || err}`);
    return null;
  }
-  // @ts-ignore
-  if (stream) video.srcObject = stream;
-  else {
+  if (stream) {
+    video.srcObject = stream;
+  } else {
    log('Camera Error: stream empty');
    return null;
  }
@@ -128,31 +127,23 @@ async function setupCamera() {
  if (settings.deviceId) delete settings.deviceId;
  if (settings.groupId) delete settings.groupId;
  if (settings.aspectRatio) settings.aspectRatio = Math.trunc(100 * settings.aspectRatio) / 100;
-  log(`Camera active: ${track.label}`); // ${str(constraints)}
+  log(`Camera active: ${track.label}`);
  log(`Camera settings: ${str(settings)}`);
  canvas.addEventListener('click', () => {
-    // @ts-ignore
    if (video && video.readyState >= 2) {
-      // @ts-ignore
      if (video.paused) {
-        // @ts-ignore
        video.play();
        detectVideo(video, canvas);
      } else {
-        // @ts-ignore
video.pause(); video.pause();
} }
} }
// @ts-ignore
log(`Camera state: ${video.paused ? 'paused' : 'playing'}`); log(`Camera state: ${video.paused ? 'paused' : 'playing'}`);
}); });
return new Promise((resolve) => { return new Promise((resolve) => {
video.onloadeddata = async () => { video.onloadeddata = async () => {
// @ts-ignore
canvas.width = video.videoWidth; canvas.width = video.videoWidth;
// @ts-ignore
canvas.height = video.videoHeight; canvas.height = video.videoHeight;
// @ts-ignore
video.play(); video.play();
detectVideo(video, canvas); detectVideo(video, canvas);
resolve(true); resolve(true);
@ -170,7 +161,6 @@ async function setupFaceAPI() {
await faceapi.nets.faceRecognitionNet.load(modelPath); await faceapi.nets.faceRecognitionNet.load(modelPath);
await faceapi.nets.faceExpressionNet.load(modelPath); await faceapi.nets.faceExpressionNet.load(modelPath);
optionsSSDMobileNet = new faceapi.SsdMobilenetv1Options({ minConfidence: minScore, maxResults }); optionsSSDMobileNet = new faceapi.SsdMobilenetv1Options({ minConfidence: minScore, maxResults });
// check tf engine state // check tf engine state
log(`Models loaded: ${str(faceapi.tf.engine().state.numTensors)} tensors`); log(`Models loaded: ${str(faceapi.tf.engine().state.numTensors)} tensors`);
} }
@ -180,19 +170,21 @@ async function main() {
log('FaceAPI WebCam Test'); log('FaceAPI WebCam Test');
// if you want to use wasm backend location for wasm binaries must be specified // if you want to use wasm backend location for wasm binaries must be specified
// await faceapi.tf.setWasmPaths('../node_modules/@tensorflow/tfjs-backend-wasm/dist/'); // await faceapi.tf?.setWasmPaths(`https://cdn.jsdelivr.net/npm/@tensorflow/tfjs-backend-wasm@${faceapi.tf.version_core}/dist/`);
// await faceapi.tf.setBackend('wasm'); // await faceapi.tf?.setBackend('wasm');
// log(`WASM SIMD: ${await faceapi.tf?.env().getAsync('WASM_HAS_SIMD_SUPPORT')} Threads: ${await faceapi.tf?.env().getAsync('WASM_HAS_MULTITHREAD_SUPPORT') ? 'Multi' : 'Single'}`);
// default is webgl backend // default is webgl backend
await faceapi.tf.setBackend('webgl'); await faceapi.tf.setBackend('webgl');
await faceapi.tf.enableProdMode();
await faceapi.tf.ENV.set('DEBUG', false);
await faceapi.tf.ready(); await faceapi.tf.ready();
// tfjs optimizations
if (faceapi.tf?.env().flagRegistry.CANVAS2D_WILL_READ_FREQUENTLY) faceapi.tf.env().set('CANVAS2D_WILL_READ_FREQUENTLY', true);
if (faceapi.tf?.env().flagRegistry.WEBGL_EXP_CONV) faceapi.tf.env().set('WEBGL_EXP_CONV', true);
if (faceapi.tf?.env().flagRegistry.WEBGL_EXP_CONV) faceapi.tf.env().set('WEBGL_EXP_CONV', true);
// check version // check version
log(`Version: FaceAPI ${str(faceapi?.version || '(not loaded)')} TensorFlow/JS ${str(faceapi?.tf?.version_core || '(not loaded)')} Backend: ${str(faceapi?.tf?.getBackend() || '(not loaded)')}`); log(`Version: FaceAPI ${str(faceapi?.version || '(not loaded)')} TensorFlow/JS ${str(faceapi.tf?.version_core || '(not loaded)')} Backend: ${str(faceapi.tf?.getBackend() || '(not loaded)')}`);
// log(`Flags: ${JSON.stringify(faceapi?.tf?.ENV.flags || { tf: 'not loaded' })}`);
await setupFaceAPI(); await setupFaceAPI();
await setupCamera(); await setupCamera();
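The orientation check in `setupCamera` above is self-contained enough to sketch on its own. `buildConstraints` is a hypothetical name, not part of the demo; the logic mirrors it: request the user-facing camera and ask for the larger window dimension as the ideal capture size.

```typescript
type VideoConstraints = { facingMode: string; resizeMode: string; width?: { ideal: number }; height?: { ideal: number } };

function buildConstraints(innerWidth: number, innerHeight: number) {
  const video: VideoConstraints = { facingMode: 'user', resizeMode: 'crop-and-scale' };
  if (innerWidth > innerHeight) video.width = { ideal: innerWidth }; // landscape: constrain width
  else video.height = { ideal: innerHeight };                        // portrait: constrain height
  return { audio: false, video };
}
```

In the demo the result goes straight to `navigator.mediaDevices.getUserMedia(constraints)`, which is only available on secure (https or localhost) origins.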

dist/face-api.d.ts (vendored, new file, 1 line)
@@ -0,0 +1 @@
+export * from '../types/face-api';

dist/face-api.esm-nobundle.d.ts (vendored, new file, 1 line)
@@ -0,0 +1 @@
+export * from '../types/face-api';

File diff suppressed because one or more lines are too long

dist/face-api.esm.d.ts (vendored, new file, 1 line)
@@ -0,0 +1 @@
+export * from '../types/face-api';

dist/face-api.esm.js (vendored, 67532 lines)
File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

dist/face-api.js (vendored, 2893 lines)
File diff suppressed because one or more lines are too long

dist/face-api.node-gpu.d.ts (vendored, new file, 1 line)
@@ -0,0 +1 @@
+export * from '../types/face-api';

File diff suppressed because one or more lines are too long

dist/face-api.node-wasm.d.ts (vendored, new file, 1 line)
@@ -0,0 +1 @@
+export * from '../types/face-api';

File diff suppressed because one or more lines are too long

dist/face-api.node.d.ts (vendored, new file, 1 line)
@@ -0,0 +1 @@
+export * from '../types/face-api';

dist/face-api.node.js (vendored, 4692 lines)
File diff suppressed because one or more lines are too long

dist/tfjs.esm.d.ts (vendored, 4 lines changed)
@@ -1,3 +1,4 @@
+/*
 import '@tensorflow/tfjs-core';
 import '@tensorflow/tfjs-core/dist/types';
 import '@tensorflow/tfjs-core/dist/register_all_gradients';
@@ -9,6 +10,7 @@ import '@tensorflow/tfjs-backend-cpu';
 import '@tensorflow/tfjs-backend-webgl';
 import '@tensorflow/tfjs-backend-wasm';
 import '@tensorflow/tfjs-backend-webgpu';
+*/
 export declare const version: {
   'tfjs-core': string;
@@ -20,7 +22,7 @@ export declare const version: {
   tfjs: string;
 };
-// export { io, browser, image } from '@tensorflow/tfjs-core';
+export { io, browser, image } from '@tensorflow/tfjs-core';
 export { tensor, tidy, softmax, unstack, relu, add, conv2d, cast, zeros, concat, avgPool, stack, fill, transpose, tensor1d, tensor2d, tensor3d, tensor4d, maxPool, matMul, mul, sub, scalar } from '@tensorflow/tfjs-core';
 export { div, pad, slice, reshape, slice3d, expandDims, depthwiseConv2d, separableConv2d, sigmoid, exp, tile, batchNorm, clipByValue } from '@tensorflow/tfjs-core';
 export { ENV, Variable, Tensor, TensorLike, Rank, Tensor1D, Tensor2D, Tensor3D, Tensor4D, Tensor5D, NamedTensorMap } from '@tensorflow/tfjs-core';

dist/tfjs.esm.js (vendored, 65719 lines)
File diff suppressed because one or more lines are too long

dist/tfjs.version.d.ts (vendored, new file, 9 lines)
@@ -0,0 +1,9 @@
+export declare const version: {
+  'tfjs-core': string;
+  'tfjs-backend-cpu': string;
+  'tfjs-backend-webgl': string;
+  'tfjs-data': string;
+  'tfjs-layers': string;
+  'tfjs-converter': string;
+  tfjs: string;
+};

dist/tfjs.version.js (vendored, 39 lines changed)
@@ -4,41 +4,4 @@
   author: <https://github.com/vladmandic>'
 */
-// node_modules/.pnpm/@tensorflow+tfjs@3.13.0_seedrandom@3.0.5/node_modules/@tensorflow/tfjs/package.json
-var version = "3.13.0";
-// node_modules/.pnpm/@tensorflow+tfjs-core@3.13.0/node_modules/@tensorflow/tfjs-core/package.json
-var version2 = "3.13.0";
-// node_modules/.pnpm/@tensorflow+tfjs-data@3.13.0_dadde02861a8b00ace7633d17571891e/node_modules/@tensorflow/tfjs-data/package.json
-var version3 = "3.13.0";
-// node_modules/.pnpm/@tensorflow+tfjs-layers@3.13.0_@tensorflow+tfjs-core@3.13.0/node_modules/@tensorflow/tfjs-layers/package.json
-var version4 = "3.13.0";
-// node_modules/.pnpm/@tensorflow+tfjs-converter@3.13.0_@tensorflow+tfjs-core@3.13.0/node_modules/@tensorflow/tfjs-converter/package.json
-var version5 = "3.13.0";
-// node_modules/.pnpm/@tensorflow+tfjs-backend-cpu@3.13.0_@tensorflow+tfjs-core@3.13.0/node_modules/@tensorflow/tfjs-backend-cpu/package.json
-var version6 = "3.13.0";
-// node_modules/.pnpm/@tensorflow+tfjs-backend-webgl@3.13.0_@tensorflow+tfjs-core@3.13.0/node_modules/@tensorflow/tfjs-backend-webgl/package.json
-var version7 = "3.13.0";
-// node_modules/.pnpm/@tensorflow+tfjs-backend-wasm@3.13.0_@tensorflow+tfjs-core@3.13.0/node_modules/@tensorflow/tfjs-backend-wasm/package.json
-var version8 = "3.13.0";
-// src/tfjs/tf-version.ts
-var version9 = {
-  tfjs: version,
-  "tfjs-core": version2,
-  "tfjs-data": version3,
-  "tfjs-layers": version4,
-  "tfjs-converter": version5,
-  "tfjs-backend-cpu": version6,
-  "tfjs-backend-webgl": version7,
-  "tfjs-backend-wasm": version8
-};
-export {
-  version9 as version
-};
+var e="4.22.0";var s="4.22.0";var t="4.22.0";var n="4.22.0";var i="4.22.0";var w={tfjs:e,"tfjs-core":e,"tfjs-converter":s,"tfjs-backend-cpu":t,"tfjs-backend-webgl":n,"tfjs-backend-wasm":i};export{w as version};

package.json
@@ -1,6 +1,6 @@
 {
   "name": "@vladmandic/face-api",
-  "version": "1.6.5",
+  "version": "1.7.15",
   "description": "FaceAPI: AI-powered Face Detection & Rotation Tracking, Face Description & Recognition, Age & Gender & Emotion Prediction for Browser and NodeJS using TensorFlow/JS",
   "sideEffects": false,
   "main": "dist/face-api.node.js",
@@ -22,8 +22,8 @@
   },
   "scripts": {
     "start": "node --no-warnings demo/node.js",
-    "dev": "build --profile development",
     "build": "node build.js",
+    "dev": "build --profile development",
     "lint": "eslint src/ demo/",
     "test": "node --trace-warnings test/test-node.js",
     "scan": "npx auditjs@latest ossi --dev --quiet"
@@ -42,42 +42,38 @@
     "tfjs"
   ],
   "devDependencies": {
-    "@canvas/image": "^1.0.1",
-    "@microsoft/api-extractor": "^7.19.4",
-    "@tensorflow/tfjs": "^3.13.0",
-    "@tensorflow/tfjs-backend-cpu": "^3.13.0",
-    "@tensorflow/tfjs-backend-wasm": "^3.13.0",
-    "@tensorflow/tfjs-backend-webgl": "^3.13.0",
-    "@tensorflow/tfjs-backend-webgpu": "^0.0.1-alpha.8",
-    "@tensorflow/tfjs-converter": "^3.13.0",
-    "@tensorflow/tfjs-core": "^3.13.0",
-    "@tensorflow/tfjs-data": "^3.13.0",
-    "@tensorflow/tfjs-layers": "^3.13.0",
-    "@tensorflow/tfjs-node": "^3.13.0",
-    "@tensorflow/tfjs-node-gpu": "^3.13.0",
-    "@types/node": "^17.0.15",
-    "@types/offscreencanvas": "^2019.6.4",
-    "@typescript-eslint/eslint-plugin": "^5.10.2",
-    "@typescript-eslint/parser": "^5.10.2",
-    "@vladmandic/build": "^0.6.9",
-    "@vladmandic/pilogger": "^0.4.2",
-    "@vladmandic/tfjs": "github:vladmandic/tfjs",
-    "canvas": "^2.9.0",
-    "chokidar": "^3.5.3",
-    "dayjs": "^1.10.7",
-    "esbuild": "^0.14.19",
-    "eslint": "^8.8.0",
+    "@canvas/image": "^2.0.0",
+    "@microsoft/api-extractor": "^7.49.2",
+    "@tensorflow/tfjs": "^4.22.0",
+    "@tensorflow/tfjs-backend-cpu": "^4.22.0",
+    "@tensorflow/tfjs-backend-wasm": "^4.22.0",
+    "@tensorflow/tfjs-backend-webgl": "^4.22.0",
+    "@tensorflow/tfjs-backend-webgpu": "4.22.0",
+    "@tensorflow/tfjs-converter": "^4.22.0",
+    "@tensorflow/tfjs-core": "^4.22.0",
+    "@tensorflow/tfjs-data": "^4.22.0",
+    "@tensorflow/tfjs-layers": "^4.22.0",
+    "@tensorflow/tfjs-node": "^4.22.0",
+    "@tensorflow/tfjs-node-gpu": "^4.22.0",
+    "@types/node": "^22.13.1",
+    "@types/offscreencanvas": "^2019.7.3",
+    "@typescript-eslint/eslint-plugin": "^8.5.0",
+    "@typescript-eslint/parser": "^8.5.0",
+    "@vladmandic/build": "^0.10.2",
+    "@vladmandic/pilogger": "^0.5.1",
+    "ajv": "^8.17.1",
+    "esbuild": "^0.24.2",
+    "eslint": "8.57.0",
     "eslint-config-airbnb-base": "^15.0.0",
-    "eslint-plugin-import": "^2.25.4",
-    "eslint-plugin-json": "^3.1.0",
+    "eslint-plugin-import": "^2.30.0",
+    "eslint-plugin-json": "^4.0.1",
     "eslint-plugin-node": "^11.1.0",
-    "eslint-plugin-promise": "^6.0.0",
-    "node-fetch": "^3.2.0",
-    "rimraf": "^3.0.2",
+    "eslint-plugin-promise": "^7.1.0",
+    "node-fetch": "^3.3.2",
+    "rimraf": "^6.0.1",
     "seedrandom": "^3.0.5",
-    "simple-git": "^3.1.1",
-    "tslib": "^2.3.1",
-    "typedoc": "^0.22.11",
-    "typescript": "4.5.5"
+    "tslib": "^2.8.1",
+    "typedoc": "^0.27.6",
+    "typescript": "5.7.3"
   }
 }

@@ -102,8 +102,9 @@ export abstract class NeuralNetwork<TNetParams> {
   }
   const { readFile } = env.getEnv();
   const { manifestUri, modelBaseUri } = getModelUris(filePath, this.getDefaultModelName());
-  const fetchWeightsFromDisk = (filePaths: string[]) => Promise.all(filePaths.map((fp) => readFile(fp).then((buf) => buf.buffer)));
-  const loadWeights = tf.io.weightsLoaderFactory(fetchWeightsFromDisk);
+  const fetchWeightsFromDisk = (filePaths: string[]) => Promise.all(filePaths.map((fp) => readFile(fp).then((buf) => (typeof buf === 'string' ? Buffer.from(buf) : buf.buffer))));
+  // @ts-ignore async-vs-sync mismatch
+  const loadWeights = tf['io'].weightsLoaderFactory(fetchWeightsFromDisk);
   const manifest = JSON.parse((await readFile(manifestUri)).toString());
   const weightMap = await loadWeights(manifest, modelBaseUri);
   this.loadFromWeightMap(weightMap);
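The `fetchWeightsFromDisk` change above exists because the node `readFile` shim may now resolve with either a string or a Buffer. A standalone sketch of that normalization (`toArrayBuffer` is a hypothetical name; slicing to the Buffer's own view is an extra precaution here, since `buf.buffer` can be a larger shared ArrayBuffer than the Buffer itself):

```typescript
// Normalize the string | Buffer union the way the patched loader does:
// wrap strings in a Buffer first, then take the underlying bytes.
function toArrayBuffer(buf: string | Buffer): ArrayBufferLike {
  const b = typeof buf === 'string' ? Buffer.from(buf) : buf;
  return b.buffer.slice(b.byteOffset, b.byteOffset + b.byteLength);
}
```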

@@ -128,9 +128,7 @@ export class Box<BoxType = any> implements IBoundingBox, IRect {
     this.width + padX,
     this.height + padY,
   ];
-  return new Box({
-    x, y, width, height,
-  });
+  return new Box({ x, y, width, height });
 }
 public clipAtImageBorders(imgWidth: number, imgHeight: number): Box<BoxType> {
@@ -143,9 +141,7 @@ export class Box<BoxType = any> implements IBoundingBox, IRect {
   const clippedWidth = Math.min(newWidth, imgWidth - clippedX);
   const clippedHeight = Math.min(newHeight, imgHeight - clippedY);
-  return (new Box({
-    x: clippedX, y: clippedY, width: clippedWidth, height: clippedHeight,
-  })).floor();
+  return (new Box({ x: clippedX, y: clippedY, width: clippedWidth, height: clippedHeight })).floor();
 }
 public shift(sx: number, sy: number): Box<BoxType> {
@@ -153,9 +149,7 @@ export class Box<BoxType = any> implements IBoundingBox, IRect {
   const x = this.x + sx;
   const y = this.y + sy;
-  return new Box({
-    x, y, width, height,
-  });
+  return new Box({ x, y, width, height });
 }
 public padAtBorders(imageHeight: number, imageWidth: number) {
@@ -189,9 +183,7 @@ export class Box<BoxType = any> implements IBoundingBox, IRect {
     y = 1;
   }
-  return {
-    dy, edy, dx, edx, y, ey, x, ex, w, h,
-  };
+  return { dy, edy, dx, edx, y, ey, x, ex, w, h };
 }
 public calibrate(region: Box) {

@@ -131,14 +131,14 @@ export class NetInput {
     imgTensor = padToSquare(imgTensor as tf.Tensor4D, isCenterInputs);
     if (imgTensor.shape[1] !== inputSize || imgTensor.shape[2] !== inputSize) {
-      imgTensor = tf.image.resizeBilinear(imgTensor as tf.Tensor4D, [inputSize, inputSize], false, false);
+      imgTensor = tf['image'].resizeBilinear(imgTensor as tf.Tensor4D, [inputSize, inputSize], false, false);
     }
     return imgTensor.as3D(inputSize, inputSize, 3);
   }
   if (input instanceof env.getEnv().Canvas) {
-    return tf.browser.fromPixels(imageToSquare(input, inputSize, isCenterInputs));
+    return tf['browser'].fromPixels(imageToSquare(input, inputSize, isCenterInputs));
   }
   throw new Error(`toBatchTensor - at batchIdx ${batchIdx}, expected input to be instanceof tf.Tensor or instanceof HTMLCanvasElement, instead have ${input}`);

@@ -4,7 +4,10 @@ import { isMediaLoaded } from './isMediaLoaded';
 export function awaitMediaLoaded(media: HTMLImageElement | HTMLVideoElement | HTMLCanvasElement) {
   // eslint-disable-next-line consistent-return
   return new Promise((resolve, reject) => {
-    if (media instanceof env.getEnv().Canvas || isMediaLoaded(media)) resolve(null);
+    if (media instanceof env.getEnv().Canvas || isMediaLoaded(media)) {
+      resolve(null);
+      return;
+    }
     function onError(e: Event) {
       if (!e.currentTarget) return;

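The extra `return` added in `awaitMediaLoaded` matters because calling `resolve` does not stop the executor; without it, the listener-attaching code further down would still run. A stripped-down illustration of the pattern (`waitUntilReady` and its arguments are hypothetical, not face-api API):

```typescript
// resolve() only settles the promise; execution continues unless we return.
function waitUntilReady(alreadyLoaded: boolean, attachListeners: () => void): Promise<string> {
  return new Promise((resolve) => {
    if (alreadyLoaded) {
      resolve('immediate');
      return; // without this, attachListeners() below would still be called
    }
    attachListeners();
    resolve('deferred');
  });
}
```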
@@ -25,17 +25,11 @@ export async function extractFaceTensors(imageTensor: tf.Tensor3D | tf.Tensor4D,
   return tf.tidy(() => {
     const [imgHeight, imgWidth, numChannels] = imageTensor.shape.slice(isTensor4D(imageTensor) ? 1 : 0);
-    const boxes = detections
-      .map((det) => (det instanceof FaceDetection
-        ? det.forSize(imgWidth, imgHeight).box
-        : det))
+    const boxes = detections.map((det) => (det instanceof FaceDetection ? det.forSize(imgWidth, imgHeight).box : det))
       .map((box) => box.clipAtImageBorders(imgWidth, imgHeight));
-    const faceTensors = boxes.map(({
-      x, y, width, height,
-    }) => tf.slice3d(imageTensor.as3D(imgHeight, imgWidth, numChannels), [y, x, 0], [height, width, numChannels]));
+    const faceTensors = boxes
+      .filter((box) => box.width > 0 && box.height > 0)
+      .map(({ x, y, width, height }) => tf.slice3d(imageTensor.as3D(imgHeight, imgWidth, numChannels), [y, x, 0], [height, width, numChannels]));
     return faceTensors;
   });
 }
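The new `.filter()` above drops boxes that were clipped down to nothing at the image border, which would otherwise produce empty tensor slices. A rough standalone approximation of the clip-then-filter step (`SimpleBox` and `clipBoxes` are illustrative names, not face-api API, and the clipping is simplified relative to `Box.clipAtImageBorders`):

```typescript
type SimpleBox = { x: number; y: number; width: number; height: number };

function clipBoxes(boxes: SimpleBox[], imgWidth: number, imgHeight: number): SimpleBox[] {
  return boxes
    .map((b) => {
      // clamp the origin to the image, shrinking width/height accordingly
      const x = Math.max(0, b.x);
      const y = Math.max(0, b.y);
      const width = Math.min(b.width - (x - b.x), imgWidth - x);
      const height = Math.min(b.height - (y - b.y), imgHeight - y);
      return { x, y, width, height };
    })
    // degenerate boxes would yield empty slices, so drop them
    .filter((b) => b.width > 0 && b.height > 0);
}
```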

@@ -3,21 +3,10 @@ import { resolveInput } from './resolveInput';
 export function getContext2dOrThrow(canvasArg: string | HTMLCanvasElement | CanvasRenderingContext2D): CanvasRenderingContext2D {
   const { Canvas, CanvasRenderingContext2D } = env.getEnv();
-  if (canvasArg instanceof CanvasRenderingContext2D) {
-    return canvasArg;
-  }
+  if (canvasArg instanceof CanvasRenderingContext2D) return canvasArg;
   const canvas = resolveInput(canvasArg);
-  if (!(canvas instanceof Canvas)) {
-    throw new Error('resolveContext2d - expected canvas to be of instance of Canvas');
-  }
-  const ctx = canvas.getContext('2d');
-  if (!ctx) {
-    throw new Error('resolveContext2d - canvas 2d context is null');
-  }
+  if (!(canvas instanceof Canvas)) throw new Error('resolveContext2d - expected canvas to be of instance of Canvas');
+  const ctx = canvas.getContext('2d', { willReadFrequently: true });
+  if (!ctx) throw new Error('resolveContext2d - canvas 2d context is null');
   return ctx;
 }

@@ -11,7 +11,7 @@ export async function imageTensorToCanvas(
   const [height, width, numChannels] = imgTensor.shape.slice(isTensor4D(imgTensor) ? 1 : 0);
   const imgTensor3D = tf.tidy(() => imgTensor.as3D(height, width, numChannels).toInt());
-  await tf.browser.toPixels(imgTensor3D, targetCanvas);
+  await tf['browser'].toPixels(imgTensor3D, targetCanvas);
   imgTensor3D.dispose();

@@ -8,7 +8,8 @@ export async function loadWeightMap(
   defaultModelName: string,
 ): Promise<tf.NamedTensorMap> {
   const { manifestUri, modelBaseUri } = getModelUris(uri, defaultModelName);
+  // @ts-ignore
   const manifest = await fetchJson<tf.io.WeightsManifestConfig>(manifestUri);
   // if (manifest['weightsManifest']) manifest = manifest['weightsManifest'];
-  return tf.io.loadWeights(manifest, modelBaseUri);
+  return tf['io'].loadWeights(manifest, modelBaseUri);
 }

@@ -1,11 +1,9 @@
-import * as tf from '../../dist/tfjs.esm';
+import type { Tensor3D, Tensor4D } from '../../dist/tfjs.esm';
 import { NetInput } from './NetInput';
 export type TMediaElement = HTMLImageElement | HTMLVideoElement | HTMLCanvasElement
-export type TResolvedNetInput = TMediaElement | tf.Tensor3D | tf.Tensor4D
-export type TNetInputArg = string | TResolvedNetInput
-export type TNetInput = TNetInputArg | Array<TNetInputArg> | NetInput | tf.Tensor4D
+export type TResolvedNetInput = TMediaElement | Tensor3D | Tensor4D
+export type TNetInput = string | TResolvedNetInput | Array<string | TResolvedNetInput> | NetInput

@@ -5,7 +5,7 @@ export function createFileSystem(fs?: any): FileSystem {
   let requireFsError = '';
   if (!fs && isNodejs()) {
     try {
-      // eslint-disable-next-line global-require
+      // eslint-disable-next-line global-require, @typescript-eslint/no-require-imports
       fs = require('fs');
     } catch (err) {
       requireFsError = (err as any).toString();
@@ -13,7 +13,8 @@ export function createFileSystem(fs?: any): FileSystem {
   }
   const readFile = fs
-    ? (filePath: string) => new Promise((resolve, reject) => { fs.readFile(filePath, (err: any, buffer) => (err ? reject(err) : resolve(buffer))); })
+    // eslint-disable-next-line no-undef
+    ? (filePath: string) => new Promise<string | Buffer>((resolve, reject) => { fs.readFile(filePath, (err: NodeJS.ErrnoException | null, buffer: string | Buffer) => (err ? reject(err) : resolve(buffer))); })
     : () => { throw new Error(`readFile - failed to require fs in nodejs environment with error: ${requireFsError}`); };
   return { readFile };
 }

@@ -3,11 +3,9 @@ import { createFileSystem } from './createFileSystem';
 import { Environment } from './types';
 export function createNodejsEnv(): Environment {
-  // eslint-disable-next-line dot-notation
-  const Canvas = global['Canvas'] || global.HTMLCanvasElement;
+  const Canvas: (new () => HTMLCanvasElement) = (global as any)['Canvas'] || global.HTMLCanvasElement;
   const Image = global.Image || global.HTMLImageElement;
-  // eslint-disable-next-line dot-notation
-  const Video = global['Video'] || global.HTMLVideoElement;
+  const Video: (new () => HTMLVideoElement) = (global as any)['Video'] || global.HTMLVideoElement;
   const createCanvasElement = () => {
     if (Canvas) return new Canvas();

src/env/types.ts (vendored, 24 lines changed)
@@ -1,17 +1,17 @@
 export type FileSystem = {
   // eslint-disable-next-line no-unused-vars
-  readFile: (filePath: string) => Promise<any>
-}
+  readFile: (filePath: string) => Promise<string | Buffer>;
+};
 export type Environment = FileSystem & {
-  Canvas: typeof HTMLCanvasElement
-  CanvasRenderingContext2D: typeof CanvasRenderingContext2D
-  Image: typeof HTMLImageElement
-  ImageData: typeof ImageData
-  Video: typeof HTMLVideoElement
-  createCanvasElement: () => HTMLCanvasElement
-  createImageElement: () => HTMLImageElement
-  createVideoElement: () => HTMLVideoElement
+  Canvas: typeof HTMLCanvasElement;
+  CanvasRenderingContext2D: typeof CanvasRenderingContext2D;
+  Image: typeof HTMLImageElement;
+  ImageData: typeof ImageData;
+  Video: typeof HTMLVideoElement;
+  createCanvasElement: () => HTMLCanvasElement;
+  createImageElement: () => HTMLImageElement;
+  createVideoElement: () => HTMLVideoElement;
   // eslint-disable-next-line no-undef, no-unused-vars
-  fetch: (url: string, init?: RequestInit) => Promise<Response>
-}
+  fetch: (url: string, init?: RequestInit) => Promise<Response>;
+};

@@ -1,12 +1,10 @@
 export function euclideanDistance(arr1: number[] | Float32Array, arr2: number[] | Float32Array) {
   if (arr1.length !== arr2.length) throw new Error('euclideanDistance: arr1.length !== arr2.length');
   const desc1 = Array.from(arr1);
   const desc2 = Array.from(arr2);
   return Math.sqrt(
     desc1
       .map((val, i) => val - desc2[i])
-      .reduce((res, diff) => res + (diff ** 2), 0),
+      .reduce((res, diff) => res + (diff * diff), 0),
   );
 }
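The function above is pure and easy to exercise; the same body, runnable standalone:

```typescript
// Same contract as the exported euclideanDistance: equal-length descriptor
// vectors in, scalar distance out.
function euclideanDistance(arr1: number[] | Float32Array, arr2: number[] | Float32Array): number {
  if (arr1.length !== arr2.length) throw new Error('euclideanDistance: arr1.length !== arr2.length');
  const desc1 = Array.from(arr1);
  const desc2 = Array.from(arr2);
  return Math.sqrt(desc1.map((val, i) => val - desc2[i]).reduce((res, diff) => res + diff * diff, 0));
}
```

face-api compares 128-dimension face descriptors this way; a distance below roughly 0.6 is conventionally treated as the same person (a rule of thumb, not a guarantee).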

@@ -1,4 +1,4 @@
-export const FACE_EXPRESSION_LABELS = ['neutral', 'happy', 'sad', 'angry', 'fearful', 'disgusted', 'surprised'];
+export const FACE_EXPRESSION_LABELS = ['neutral', 'happy', 'sad', 'angry', 'fearful', 'disgusted', 'surprised'] as const;
 export class FaceExpressions {
   public neutral = 0;

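The `as const` added above narrows the array's type from `string[]` to a readonly tuple of string literals, which lets downstream code derive a union of valid labels. A sketch of the pattern (the derived `FaceExpressionLabel` alias is illustrative, not part of the source):

```typescript
const FACE_EXPRESSION_LABELS = ['neutral', 'happy', 'sad', 'angry', 'fearful', 'disgusted', 'surprised'] as const;

// Union of the literal strings: 'neutral' | 'happy' | ... | 'surprised'
type FaceExpressionLabel = typeof FACE_EXPRESSION_LABELS[number];

const label: FaceExpressionLabel = 'happy'; // 'smug' here would fail to compile
```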
@@ -1,3 +1,4 @@
+import { Point } from '../classes';
 import { FaceDetection } from '../classes/FaceDetection';
 import { FaceLandmarks } from '../classes/FaceLandmarks';
 import { FaceLandmarks68 } from '../classes/FaceLandmarks68';
@@ -5,75 +6,106 @@ import { isWithFaceDetection, WithFaceDetection } from './WithFaceDetection';
 export type WithFaceLandmarks<
   TSource extends WithFaceDetection<{}>,
-  TFaceLandmarks extends FaceLandmarks = FaceLandmarks68 > = TSource & {
-  landmarks: TFaceLandmarks,
-  unshiftedLandmarks: TFaceLandmarks,
-  alignedRect: FaceDetection,
-  angle: { roll: number | undefined, pitch: number | undefined, yaw: number | undefined },
-}
+  TFaceLandmarks extends FaceLandmarks = FaceLandmarks68
+> = TSource & {
+  landmarks: TFaceLandmarks;
+  unshiftedLandmarks: TFaceLandmarks;
+  alignedRect: FaceDetection;
+  angle: {
+    roll: number | undefined;
+    pitch: number | undefined;
+    yaw: number | undefined;
+  };
+};
-export function isWithFaceLandmarks(obj: any): obj is WithFaceLandmarks<WithFaceDetection<{}>, FaceLandmarks> {
-  return isWithFaceDetection(obj)
-    // eslint-disable-next-line dot-notation
-    && obj['landmarks'] instanceof FaceLandmarks
-    // eslint-disable-next-line dot-notation
-    && obj['unshiftedLandmarks'] instanceof FaceLandmarks
-    // eslint-disable-next-line dot-notation
-    && obj['alignedRect'] instanceof FaceDetection;
+export function isWithFaceLandmarks(
+  obj: any,
+): obj is WithFaceLandmarks<WithFaceDetection<{}>, FaceLandmarks> {
+  return (
+    isWithFaceDetection(obj)
+    && (obj as any)['landmarks'] instanceof FaceLandmarks
+    && (obj as any)['unshiftedLandmarks'] instanceof FaceLandmarks
+    && (obj as any)['alignedRect'] instanceof FaceDetection
+  );
 }
-function calculateFaceAngle(mesh) {
-  // returns the angle in the plane (in radians) between the positive x-axis and the ray from (0,0) to the point (x,y)
-  const radians = (a1, a2, b1, b2) => (Math.atan2(b2 - a2, b1 - a1) % Math.PI);
-  // convert radians to degrees
-  // eslint-disable-next-line no-unused-vars, @typescript-eslint/no-unused-vars
-  const degrees = (theta) => (theta * 180) / Math.PI;
-  const angle = { roll: <number | undefined>undefined, pitch: <number | undefined>undefined, yaw: <number | undefined>undefined };
-  if (!mesh || !mesh._positions || mesh._positions.length !== 68) return angle;
-  const pt = mesh._positions;
-  // values are in radians in range of -pi/2 to pi/2 which is -90 to +90 degrees
-  // value of 0 means center
-  // roll is face lean from left to right
-  // comparing x,y of outside corners of leftEye and rightEye
-  angle.roll = -radians(pt[36]._x, pt[36]._y, pt[45]._x, pt[45]._y);
-  // pitch is face turn from left right
-  // comparing x distance of top of nose to left and right edge of face
-  // precision is lacking since coordinates are not precise enough
-  angle.pitch = radians(0, Math.abs(pt[0]._x - pt[30]._x) / pt[30]._x, Math.PI, Math.abs(pt[16]._x - pt[30]._x) / pt[30]._x);
-  // yaw is face move from up to down
-  // comparing size of the box around the face with top and bottom of detected landmarks
-  // silly hack, but this gives us face compression on y-axis
+function calculateFaceAngle(mesh: FaceLandmarks) {
+  // Helper to convert radians to degrees
+  // eslint-disable-next-line no-unused-vars, @typescript-eslint/no-unused-vars
+  const degrees = (radians: number) => (radians * 180) / Math.PI;
+  const calcLengthBetweenTwoPoints = (a: Point, b: Point) => Math.sqrt((a.x - b.x) ** 2 + (a.y - b.y) ** 2);
+  const angle = {
+    roll: <number | undefined>undefined,
+    pitch: <number | undefined>undefined,
+    yaw: <number | undefined>undefined,
+  };
+  const calcYaw = (leftPoint: Point, midPoint: Point, rightPoint: Point) => {
+    // Calc x-distance from left side of the face ("ear") to facial midpoint ("nose")
+    const leftToMidpoint = Math.floor(leftPoint.x - midPoint.x);
+    // Calc x-distance from facial midpoint ("nose") to the right side of the face ("ear")
+    const rightToMidpoint = Math.floor(midPoint.x - rightPoint.x);
+    // Difference in distances coincidentally approximates to angles
+    return leftToMidpoint - rightToMidpoint;
+  };
+  const calcRoll = (lever: Point, pivot: Point) => {
+    // When rolling, the head seems to pivot from the nose/lips/chin area.
+    // So, we'll choose any two points from the facial midline, where the first point should be the pivot and the other the "lever"
+    // Plan/Execution: get the hypotenuse & opposite sides of a 90deg triangle ==> calculate the angle in radians
+    const hypotenuse = Math.hypot(pivot.x - lever.x, pivot.y - lever.y);
+    const opposite = pivot.y - lever.y;
+    const angleInRadians = Math.asin(opposite / hypotenuse);
+    const angleInDegrees = degrees(angleInRadians);
+    const normalizeAngle = Math.floor(90 - angleInDegrees);
+    // If the lever is more to the left of the pivot, then we're tilting left
+    // "-" is the negative direction; "+", or the absence of a sign, is the positive direction
+    const tiltDirection = pivot.x - lever.x < 0 ? -1 : 1;
+    const result = normalizeAngle * tiltDirection;
+    return result;
+  };
+  const calcPitch = (leftPoint: Point, midPoint: Point, rightPoint: Point) => {
+    // Theory: While pitching, the nose is the most salient point --> that's what we'll use to make a triangle.
+    // The "base" runs between points that don't move when we pitch our head (i.e. an imaginary line running ear to ear through the nose).
+    // Execution: Get the opposite & adjacent lengths of the triangle from the ear's perspective. Use them to get the angle.
+    const base = calcLengthBetweenTwoPoints(leftPoint, rightPoint);
+    // adjacent is base/2 technically.
+    const baseCoords = new Point((leftPoint.x + rightPoint.x) / 2, (leftPoint.y + rightPoint.y) / 2);
+    const midToBaseLength = calcLengthBetweenTwoPoints(midPoint, baseCoords);
+    const angleInRadians = Math.atan(midToBaseLength / base);
+    const angleInDegrees = Math.floor(degrees(angleInRadians));
+    // Account for directionality:
+    // pitch forwards (i.e. tilting your head forwards) is positive (or has no sign); backward is negative.
// e.g., tilting head up hides the forehead that doesn't have any landmarks so ratio drops const direction = baseCoords.y - midPoint.y < 0 ? -1 : 1;
const bottom = pt.reduce((prev, cur) => (prev < cur._y ? prev : cur._y), +Infinity); const result = angleInDegrees * direction;
const top = pt.reduce((prev, cur) => (prev > cur._y ? prev : cur._y), -Infinity); return result;
angle.yaw = Math.PI * (mesh._imgDims._height / (top - bottom) / 1.40 - 1); };
if (!mesh || !mesh.positions || mesh.positions.length !== 68) return angle;
const pt = mesh.positions;
angle.roll = calcRoll(pt[27], pt[66]);
angle.pitch = calcPitch(pt[14], pt[30], pt[2]);
angle.yaw = calcYaw(pt[14], pt[33], pt[2]);
return angle; return angle;
} }
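The roll math above can be sketched standalone (a plain `x`/`y` object stands in for the library's `Point` class; the sample coordinates are illustrative, not from the repo): `asin(opposite / hypotenuse)` gives the facial midline's inclination, which is then normalized so a vertical midline reads as 0, with the sign set by which side of the pivot the lever sits on.

```typescript
// Minimal standalone sketch of calcRoll (assumed plain-object points).
interface Pt { x: number; y: number }

const degrees = (radians: number): number => (radians * 180) / Math.PI;

function calcRoll(lever: Pt, pivot: Pt): number {
  // Hypotenuse and opposite side of the right triangle formed by the midline.
  const hypotenuse = Math.hypot(pivot.x - lever.x, pivot.y - lever.y);
  const opposite = pivot.y - lever.y;
  const angleInDegrees = degrees(Math.asin(opposite / hypotenuse));
  const normalizeAngle = Math.floor(90 - angleInDegrees);
  // Sign flips depending on which side of the pivot the lever sits.
  const tiltDirection = pivot.x - lever.x < 0 ? -1 : 1;
  return normalizeAngle * tiltDirection;
}

console.log(calcRoll({ x: 50, y: 50 }, { x: 100, y: 150 }));  // 26
console.log(calcRoll({ x: 150, y: 50 }, { x: 100, y: 150 })); // -26
```

The same midline tilted to either side yields the same magnitude with opposite signs.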
export function extendWithFaceLandmarks<TSource extends WithFaceDetection<{}>, TFaceLandmarks extends FaceLandmarks = FaceLandmarks68>(
  sourceObj: TSource,
  unshiftedLandmarks: TFaceLandmarks,
): WithFaceLandmarks<TSource, TFaceLandmarks> {
  const { box: shift } = sourceObj.detection;
  const landmarks = unshiftedLandmarks.shiftBy<TFaceLandmarks>(shift.x, shift.y);
  const rect = landmarks.align();
  const { imageDims } = sourceObj.detection;
  const alignedRect = new FaceDetection(
    sourceObj.detection.score,
    rect.rescale(imageDims.reverse()),
    imageDims,
  );
  const angle = calculateFaceAngle(unshiftedLandmarks);
  const extension = {
    landmarks,
    unshiftedLandmarks,
    alignedRect,
    angle,
  };
  return { ...sourceObj, ...extension };
}
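The `shiftBy` call above translates box-relative landmark coordinates into image-space coordinates. A standalone sketch (hypothetical plain-object shapes, not the library's classes) of that translation:

```typescript
// Assumed plain x/y points; landmark coordinates are relative to the
// detection box, so they are shifted by the box's top-left corner.
interface P { x: number; y: number }

const shiftBy = (points: P[], dx: number, dy: number): P[] =>
  points.map((p) => ({ x: p.x + dx, y: p.y + dy }));

const box = { x: 40, y: 60 };                              // detection box top-left
const relative: P[] = [{ x: 10, y: 5 }, { x: 25, y: 30 }]; // box-relative landmarks
const absolute = shiftBy(relative, box.x, box.y);

console.log(absolute); // [ { x: 50, y: 65 }, { x: 65, y: 90 } ]
```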

View File

@ -39,11 +39,12 @@ export class DetectAllFaceLandmarksTask<TSource extends WithFaceDetection<{}>> e
    const faces: Array<HTMLCanvasElement | tf.Tensor3D> = this.input instanceof tf.Tensor
      ? await extractFaceTensors(this.input, detections)
      : await extractFaces(this.input, detections);
    const faceLandmarksByFace = await Promise.all(faces.map((face) => this.landmarkNet.detectLandmarks(face))) as FaceLandmarks68[];
    faces.forEach((f) => f instanceof tf.Tensor && f.dispose());
    const result = parentResults
      .filter((_parentResult, i) => faceLandmarksByFace[i])
      .map((parentResult, i) => extendWithFaceLandmarks<TSource>(parentResult, faceLandmarksByFace[i]));
    return result;
  }
  withFaceExpressions() {
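One subtlety worth noting about the `.filter().map()` chain introduced in this hunk: `map` receives indices into the *filtered* array, so indexing a parallel array with them drifts once any entry is actually dropped. A standalone illustration (hypothetical data, not repo code) of the effect and of a pairing pattern that stays aligned:

```typescript
// Parallel arrays with one missing result.
const parents = ['a', 'b', 'c'];
const landmarksFor: (string | undefined)[] = [undefined, 'lmB', 'lmC'];

// After .filter drops 'a', .map's index restarts at 0,
// so 'b' gets paired with the slot that belonged to 'a'.
const drifted = parents
  .filter((_, i) => landmarksFor[i])
  .map((p, i) => [p, landmarksFor[i]]);

// Pairing before filtering keeps each element with its own result.
const aligned = parents
  .map((p, i) => [p, landmarksFor[i]] as const)
  .filter(([, lm]) => lm !== undefined);

console.log(drifted); // [ [ 'b', undefined ], [ 'c', 'lmB' ] ]
console.log(aligned); // [ [ 'b', 'lmB' ], [ 'c', 'lmC' ] ]
```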

View File

@ -1,4 +0,0 @@
/** Creates tfjs bundle used by Human browser build target
* @external
*/
export * from '@vladmandic/tfjs';

View File

@ -1,4 +0,0 @@
/* eslint-disable import/no-extraneous-dependencies */
/* eslint-disable node/no-unpublished-import */
export * from '@tensorflow/tfjs';

View File

@ -1,4 +1,2 @@
/* eslint-disable import/no-extraneous-dependencies */
/* eslint-disable node/no-unpublished-import */
export * from '@tensorflow/tfjs-node-gpu';
export { version } from '../../dist/tfjs.version.js';

View File

@ -1,5 +1,3 @@
/* eslint-disable import/no-extraneous-dependencies */
/* eslint-disable node/no-unpublished-import */
export * from '@tensorflow/tfjs';
export * from '@tensorflow/tfjs-backend-wasm';
export { version } from '../../dist/tfjs.version.js';

View File

@ -1,4 +1,2 @@
/* eslint-disable import/no-extraneous-dependencies */
/* eslint-disable node/no-unpublished-import */
export * from '@tensorflow/tfjs-node';
export { version } from '../../dist/tfjs.version.js';

View File

@ -1,18 +1,19 @@
// get versions of all packages
// import { version as tfjsVersion } from '@tensorflow/tfjs/package.json';
import { version as tfjsCoreVersion } from '@tensorflow/tfjs-core/package.json';
// import { version as tfjsDataVersion } from '@tensorflow/tfjs-data/package.json';
// import { version as tfjsLayersVersion } from '@tensorflow/tfjs-layers/package.json';
import { version as tfjsConverterVersion } from '@tensorflow/tfjs-converter/package.json';
import { version as tfjsBackendCPUVersion } from '@tensorflow/tfjs-backend-cpu/package.json';
import { version as tfjsBackendWebGLVersion } from '@tensorflow/tfjs-backend-webgl/package.json';
import { version as tfjsBackendWASMVersion } from '@tensorflow/tfjs-backend-wasm/package.json';

export const version = {
  // tfjs: tfjsVersion,
  tfjs: tfjsCoreVersion,
  'tfjs-core': tfjsCoreVersion,
  // 'tfjs-data': tfjsDataVersion,
  // 'tfjs-layers': tfjsLayersVersion,
  'tfjs-converter': tfjsConverterVersion,
  'tfjs-backend-cpu': tfjsBackendCPUVersion,
  'tfjs-backend-webgl': tfjsBackendWebGLVersion,

View File

@ -19,7 +19,7 @@ import { extractParams } from './extractParams';
import { extractParamsFromWeightMap } from './extractParamsFromWeightMap';
import { leaky } from './leaky';
import { ITinyYolov2Options, TinyYolov2Options } from './TinyYolov2Options';
import { DefaultTinyYolov2NetParams, MobilenetParams, TinyYolov2ExtractBoxesResult, TinyYolov2NetParams } from './types';

export class TinyYolov2Base extends NeuralNetwork<TinyYolov2NetParams> {
  public static DEFAULT_FILTER_SIZES = [3, 16, 32, 64, 128, 256, 512, 1024, 1024];

@ -183,9 +183,9 @@ export class TinyYolov2Base extends NeuralNetwork<TinyYolov2NetParams> {
      return [boxes, scores, classScores];
    });
    const results: TinyYolov2ExtractBoxesResult[] = [];
    const scoresData = await scoresTensor.array() as number[][][][];
    const boxesData = await boxesTensor.array() as number[][][][];
    for (let row = 0; row < numCells; row++) {
      for (let col = 0; col < numCells; col++) {
        for (let anchor = 0; anchor < numBoxes; anchor++) {

View File

@ -1,4 +1,5 @@
import * as tf from '../../dist/tfjs.esm';
import { BoundingBox } from '../classes';
import { ConvParams } from '../common/index';
import { SeparableConvParams } from '../common/types';

@ -38,3 +39,13 @@ export type DefaultTinyYolov2NetParams = {
}

export type TinyYolov2NetParams = DefaultTinyYolov2NetParams | MobilenetParams
export type TinyYolov2ExtractBoxesResult = {
box: BoundingBox;
score: number;
classScore: number;
label: number;
row: number;
col: number;
anchor: number;
}

51
src/types/eslint.json Normal file
View File

@ -0,0 +1,51 @@
{
"globals": {},
"env": {
"browser": true,
"commonjs": true,
"node": true,
"es2022": true
},
"parser": "@typescript-eslint/parser",
"parserOptions": {
"ecmaVersion": "latest"
},
"plugins": [
"@typescript-eslint"
],
"extends": [
"airbnb-base",
"eslint:recommended",
"plugin:@typescript-eslint/eslint-recommended",
"plugin:@typescript-eslint/recommended"
],
"rules": {
"max-len": [1, 1000, 3],
"@typescript-eslint/ban-types":"off",
"@typescript-eslint/no-empty-interface":"off",
"@typescript-eslint/no-explicit-any":"off",
"@typescript-eslint/no-extraneous-class":"off",
"@typescript-eslint/no-unnecessary-type-constraint":"off",
"@typescript-eslint/no-unused-vars":"off",
"@typescript-eslint/prefer-function-type":"off",
"@typescript-eslint/unified-signatures":"off",
"@typescript-eslint/triple-slash-reference":"off",
"camelcase":"off",
"comma-dangle":"off",
"import/extensions":"off",
"import/no-duplicates":"off",
"import/no-mutable-exports":"off",
"import/no-unresolved":"off",
"indent":"off",
"lines-between-class-members":"off",
"max-classes-per-file":"off",
"no-multiple-empty-lines":"off",
"no-shadow":"off",
"no-tabs":"off",
"no-underscore-dangle":"off",
"no-use-before-define":"off",
"object-curly-newline":"off",
"quotes":"off",
"semi":"off"
}
}

392
src/types/long.d.ts vendored Normal file
View File

@ -0,0 +1,392 @@
/* eslint-disable */
// Type definitions for long.js 4.0.0
// Project: https://github.com/dcodeIO/long.js
// Definitions by: Peter Kooijmans <https://github.com/peterkooijmans>
// Definitions: https://github.com/DefinitelyTyped/DefinitelyTyped
// Definitions by: Denis Cappellin <https://github.com/cappellin>
export = Long;
export as namespace Long;
declare const Long: Long.LongConstructor;
type Long = Long.Long;
declare namespace Long {
interface LongConstructor {
/**
* Constructs a 64 bit two's-complement integer, given its low and high 32 bit values as signed integers. See the from* functions below for more convenient ways of constructing Longs.
*/
new(low: number, high?: number, unsigned?: boolean): Long;
prototype: Long;
/**
* Maximum unsigned value.
*/
MAX_UNSIGNED_VALUE: Long;
/**
* Maximum signed value.
*/
MAX_VALUE: Long;
/**
* Minimum signed value.
*/
MIN_VALUE: Long;
/**
* Signed negative one.
*/
NEG_ONE: Long;
/**
* Signed one.
*/
ONE: Long;
/**
* Unsigned one.
*/
UONE: Long;
/**
* Unsigned zero.
*/
UZERO: Long;
/**
* Signed zero
*/
ZERO: Long;
/**
* Returns a Long representing the 64 bit integer that comes by concatenating the given low and high bits. Each is assumed to use 32 bits.
*/
fromBits(lowBits:number, highBits:number, unsigned?:boolean): Long;
/**
* Returns a Long representing the given 32 bit integer value.
*/
fromInt(value: number, unsigned?: boolean): Long;
/**
* Returns a Long representing the given value, provided that it is a finite number. Otherwise, zero is returned.
*/
fromNumber(value: number, unsigned?: boolean): Long;
/**
* Returns a Long representation of the given string, written using the specified radix.
*/
fromString(str: string, unsigned?: boolean | number, radix?: number): Long;
/**
* Creates a Long from its byte representation.
*/
fromBytes(bytes: number[], unsigned?: boolean, le?: boolean): Long;
/**
* Creates a Long from its little endian byte representation.
*/
fromBytesLE(bytes: number[], unsigned?: boolean): Long;
/**
* Creates a Long from its little endian byte representation.
*/
fromBytesBE(bytes: number[], unsigned?: boolean): Long;
/**
* Tests if the specified object is a Long.
*/
// eslint-disable-next-line @typescript-eslint/no-explicit-any
isLong(obj: any): obj is Long;
/**
* Converts the specified value to a Long.
*/
fromValue(val: Long | number | string | {low: number, high: number, unsigned: boolean}): Long;
}
interface Long
{
/**
* The high 32 bits as a signed value.
*/
high: number;
/**
* The low 32 bits as a signed value.
*/
low: number;
/**
* Whether unsigned or not.
*/
unsigned: boolean;
/**
* Returns the sum of this and the specified Long.
*/
add(addend: number | Long | string): Long;
/**
* Returns the bitwise AND of this Long and the specified.
*/
and(other: Long | number | string): Long;
/**
* Compares this Long's value with the specified's.
*/
compare(other: Long | number | string): number;
/**
* Compares this Long's value with the specified's.
*/
comp(other: Long | number | string): number;
/**
* Returns this Long divided by the specified.
*/
divide(divisor: Long | number | string): Long;
/**
* Returns this Long divided by the specified.
*/
div(divisor: Long | number | string): Long;
/**
* Tests if this Long's value equals the specified's.
*/
equals(other: Long | number | string): boolean;
/**
* Tests if this Long's value equals the specified's.
*/
eq(other: Long | number | string): boolean;
/**
* Gets the high 32 bits as a signed integer.
*/
getHighBits(): number;
/**
* Gets the high 32 bits as an unsigned integer.
*/
getHighBitsUnsigned(): number;
/**
* Gets the low 32 bits as a signed integer.
*/
getLowBits(): number;
/**
* Gets the low 32 bits as an unsigned integer.
*/
getLowBitsUnsigned(): number;
/**
* Gets the number of bits needed to represent the absolute value of this Long.
*/
getNumBitsAbs(): number;
/**
* Tests if this Long's value is greater than the specified's.
*/
greaterThan(other: Long | number | string): boolean;
/**
* Tests if this Long's value is greater than the specified's.
*/
gt(other: Long | number | string): boolean;
/**
* Tests if this Long's value is greater than or equal the specified's.
*/
greaterThanOrEqual(other: Long | number | string): boolean;
/**
* Tests if this Long's value is greater than or equal the specified's.
*/
gte(other: Long | number | string): boolean;
/**
* Tests if this Long's value is even.
*/
isEven(): boolean;
/**
* Tests if this Long's value is negative.
*/
isNegative(): boolean;
/**
* Tests if this Long's value is odd.
*/
isOdd(): boolean;
/**
* Tests if this Long's value is positive.
*/
isPositive(): boolean;
/**
* Tests if this Long's value equals zero.
*/
isZero(): boolean;
/**
* Tests if this Long's value is less than the specified's.
*/
lessThan(other: Long | number | string): boolean;
/**
* Tests if this Long's value is less than the specified's.
*/
lt(other: Long | number | string): boolean;
/**
* Tests if this Long's value is less than or equal the specified's.
*/
lessThanOrEqual(other: Long | number | string): boolean;
/**
* Tests if this Long's value is less than or equal the specified's.
*/
lte(other: Long | number | string): boolean;
/**
* Returns this Long modulo the specified.
*/
modulo(other: Long | number | string): Long;
/**
* Returns this Long modulo the specified.
*/
mod(other: Long | number | string): Long;
/**
* Returns the product of this and the specified Long.
*/
multiply(multiplier: Long | number | string): Long;
/**
* Returns the product of this and the specified Long.
*/
mul(multiplier: Long | number | string): Long;
/**
* Negates this Long's value.
*/
negate(): Long;
/**
* Negates this Long's value.
*/
neg(): Long;
/**
* Returns the bitwise NOT of this Long.
*/
not(): Long;
/**
* Tests if this Long's value differs from the specified's.
*/
notEquals(other: Long | number | string): boolean;
/**
* Tests if this Long's value differs from the specified's.
*/
neq(other: Long | number | string): boolean;
/**
* Returns the bitwise OR of this Long and the specified.
*/
or(other: Long | number | string): Long;
/**
* Returns this Long with bits shifted to the left by the given amount.
*/
shiftLeft(numBits: number | Long): Long;
/**
* Returns this Long with bits shifted to the left by the given amount.
*/
shl(numBits: number | Long): Long;
/**
* Returns this Long with bits arithmetically shifted to the right by the given amount.
*/
shiftRight(numBits: number | Long): Long;
/**
* Returns this Long with bits arithmetically shifted to the right by the given amount.
*/
shr(numBits: number | Long): Long;
/**
* Returns this Long with bits logically shifted to the right by the given amount.
*/
shiftRightUnsigned(numBits: number | Long): Long;
/**
* Returns this Long with bits logically shifted to the right by the given amount.
*/
shru(numBits: number | Long): Long;
/**
* Returns the difference of this and the specified Long.
*/
subtract(subtrahend: number | Long | string): Long;
/**
* Returns the difference of this and the specified Long.
*/
sub(subtrahend: number | Long |string): Long;
/**
* Converts the Long to a 32 bit integer, assuming it is a 32 bit integer.
*/
toInt(): number;
/**
* Converts the Long to a the nearest floating-point representation of this value (double, 53 bit mantissa).
*/
toNumber(): number;
/**
* Converts this Long to its byte representation.
*/
toBytes(le?: boolean): number[];
/**
* Converts this Long to its little endian byte representation.
*/
toBytesLE(): number[];
/**
* Converts this Long to its big endian byte representation.
*/
toBytesBE(): number[];
/**
* Converts this Long to signed.
*/
toSigned(): Long;
/**
* Converts the Long to a string written in the specified radix.
*/
toString(radix?: number): string;
/**
* Converts this Long to unsigned.
*/
toUnsigned(): Long;
/**
* Returns the bitwise XOR of this Long and the given one.
*/
xor(other: Long | number | string): Long;
}
}

86
src/types/offscreencanvas.d.ts vendored Normal file
View File

@ -0,0 +1,86 @@
/* eslint-disable */
// explicit copy of @types/offscreencanvas to enable typedef bundling
// Type definitions for non-npm package offscreencanvas-browser 2019.6
// Project: https://html.spec.whatwg.org/multipage/canvas.html#the-offscreencanvas-interface
// Definitions by: Klaus Reimer <https://github.com/kayahr>
// Oleg Varaksin <https://github.com/ova2>
// Definitions: https://github.com/DefinitelyTyped/DefinitelyTyped
// TypeScript Version: 4.3
// https://html.spec.whatwg.org/multipage/canvas.html#dom-canvas-transfercontroltooffscreen
export interface HTMLCanvasElement {
transferControlToOffscreen(): OffscreenCanvas;
}
// https://html.spec.whatwg.org/multipage/canvas.html#offscreencanvasrenderingcontext2d
export interface OffscreenCanvasRenderingContext2D extends CanvasState, CanvasTransform, CanvasCompositing,
CanvasImageSmoothing, CanvasFillStrokeStyles, CanvasShadowStyles, CanvasFilters, CanvasRect, CanvasDrawPath,
CanvasText, CanvasDrawImage, CanvasImageData, CanvasPathDrawingStyles, CanvasTextDrawingStyles, CanvasPath {
readonly canvas: OffscreenCanvas;
}
declare var OffscreenCanvasRenderingContext2D: {
prototype: OffscreenCanvasRenderingContext2D;
new(): OffscreenCanvasRenderingContext2D;
};
// https://html.spec.whatwg.org/multipage/canvas.html#the-offscreencanvas-interface
// Possible contextId values are defined by the enum OffscreenRenderingContextId { "2d", "bitmaprenderer", "webgl", "webgl2" }
// See also description: https://developer.mozilla.org/en-US/docs/Web/API/OffscreenCanvas/getContext
export interface OffscreenCanvas extends EventTarget {
width: number;
height: number;
getContext(contextId: "2d", contextAttributes?: CanvasRenderingContext2DSettings): OffscreenCanvasRenderingContext2D | null;
getContext(contextId: "bitmaprenderer", contextAttributes?: WebGLContextAttributes): ImageBitmapRenderingContext | null;
getContext(contextId: "webgl", contextAttributes?: WebGLContextAttributes): WebGLRenderingContext | null;
getContext(contextId: "webgl2", contextAttributes?: WebGLContextAttributes): WebGL2RenderingContext | null;
convertToBlob(options?: { type?: string | undefined, quality?: number | undefined }): Promise<Blob>;
transferToImageBitmap(): ImageBitmap;
}
// https://html.spec.whatwg.org/multipage/canvas.html#canvasdrawimage
export interface CanvasDrawImage {
drawImage(image: CanvasImageSource | OffscreenCanvas, dx: number, dy: number): void;
drawImage(image: CanvasImageSource | OffscreenCanvas, dx: number, dy: number, dw: number, dh: number): void;
drawImage(image: CanvasImageSource | OffscreenCanvas, sx: number, sy: number, sw: number, sh: number,
dx: number, dy: number, dw: number, dh: number): void;
}
// https://html.spec.whatwg.org/multipage/imagebitmap-and-animations.html#dom-createimagebitmap
declare function createImageBitmap(image: ImageBitmapSource | OffscreenCanvas): Promise<ImageBitmap>;
declare function createImageBitmap(image: ImageBitmapSource | OffscreenCanvas, sx: number, sy: number,
sw: number, sh: number): Promise<ImageBitmap>;
// OffscreenCanvas should be a part of Transferable => extend all postMessage methods
export interface Worker {
postMessage(message: any, transfer?: Array<Transferable | OffscreenCanvas>): void;
}
export interface ServiceWorker {
postMessage(message: any, transfer?: Array<Transferable | OffscreenCanvas>): void;
}
export interface MessagePort {
postMessage(message: any, transfer?: Array<Transferable | OffscreenCanvas>): void;
}
export interface Window {
postMessage(message: any, targetOrigin: string, transfer?: Array<Transferable | OffscreenCanvas>): void;
}
declare function postMessage(message: any, targetOrigin: string, transfer?: Array<Transferable | OffscreenCanvas>): void;
declare var OffscreenCanvas: {
prototype: OffscreenCanvas;
new(width: number, height: number): OffscreenCanvas;
};

25
src/types/tfjs.esm.d.ts vendored Normal file
View File

@ -0,0 +1,25 @@
/* eslint-disable import/no-unresolved */
/* eslint-disable import/no-extraneous-dependencies */
export * from 'types/tfjs.esm';
export declare const version: {
'tfjs-core': string;
'tfjs-backend-cpu': string;
'tfjs-backend-webgl': string;
'tfjs-data': string;
'tfjs-layers': string;
'tfjs-converter': string;
tfjs: string;
};
export * from '@tensorflow/tfjs-core';
export * from '@tensorflow/tfjs-converter';
export * from '@tensorflow/tfjs-data';
export * from '@tensorflow/tfjs-layers';
export * from '@tensorflow/tfjs-backend-cpu';
export * from '@tensorflow/tfjs-backend-wasm';
export * from '@tensorflow/tfjs-backend-webgl';
export * from '@tensorflow/tfjs-backend-webgpu';
export * from '@tensorflow/tfjs-node';
export * from '@tensorflow/tfjs-node-gpu';

49
src/types/tsconfig.json Normal file
View File

@ -0,0 +1,49 @@
{
"compilerOptions": {
"module": "esnext",
"target": "esnext",
"moduleResolution": "node",
"baseUrl": "./",
"lib": ["esnext", "dom"],
"allowJs": true,
"allowSyntheticDefaultImports": false,
"allowUnreachableCode": false,
"allowUnusedLabels": false,
"alwaysStrict": true,
"declaration": true,
"declarationMap": true,
"emitDecoratorMetadata": true,
"esModuleInterop": false,
"exactOptionalPropertyTypes": true,
"experimentalDecorators": true,
"forceConsistentCasingInFileNames": true,
"importHelpers": true,
"importsNotUsedAsValues": "error",
"isolatedModules": false,
"noEmitHelpers": true,
"noEmitOnError": false,
"noFallthroughCasesInSwitch": true,
"noImplicitAny": false,
"noImplicitOverride": true,
"noImplicitReturns": true,
"noImplicitThis": true,
"noPropertyAccessFromIndexSignature": false,
"noUncheckedIndexedAccess": false,
"noUnusedLocals": true,
"noUnusedParameters": true,
"preserveConstEnums": true,
"pretty": true,
"removeComments": false,
"resolveJsonModule": true,
"skipLibCheck": true,
"sourceMap": true,
"strictBindCallApply": true,
"strictFunctionTypes": true,
"strictNullChecks": true,
"strictPropertyInitialization": true,
"stripInternal": false,
"useDefineForClassFields": true,
"useUnknownInCatchVariables": true
},
"exclude": ["node_modules/", "dist/**/*.js"]
}

2984
src/types/webgpu.d.ts vendored Normal file

File diff suppressed because it is too large

View File

@ -56,7 +56,7 @@ export function extractParams(weights: Float32Array, numMainBlocks: number): { p
    reduction_block_1: entry_flow_reduction_block_1,
  };
  const middle_flow: Record<`main_block_${number}`, MainBlockParams> = {};
  range(numMainBlocks, 0, 1).forEach((idx) => {
    middle_flow[`main_block_${idx}`] = extractMainBlockParams(128, `middle_flow/main_block_${idx}`);
  });

View File

@ -58,7 +58,7 @@ export function extractParamsFromWeightMap(
    reduction_block_1: entry_flow_reduction_block_1,
  };
  const middle_flow: Record<`main_block_${number}`, MainBlockParams> = {};
  range(numMainBlocks, 0, 1).forEach((idx) => {
    middle_flow[`main_block_${idx}`] = extractMainBlockParams(`middle_flow/main_block_${idx}`);
  });

View File

@ -18,7 +18,7 @@ export type TinyXceptionParams = {
    reduction_block_0: ReductionBlockParams
    reduction_block_1: ReductionBlockParams
  }
  middle_flow: Record<`main_block_${number}`, MainBlockParams>,
  exit_flow: {
    reduction_block: ReductionBlockParams
    separable_conv: SeparableConvParams
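The `Record<`main_block_${number}`, MainBlockParams>` annotation used in the hunks above relies on TypeScript's template-literal key types. A minimal standalone sketch (hypothetical `BlockParams` shape, not the repo's type) of how such a record behaves:

```typescript
// Template-literal keys act like a constrained index signature:
// any "main_block_<n>" key is accepted; other string keys are rejected.
type BlockParams = { filters: number };

const middleFlow: Record<`main_block_${number}`, BlockParams> = {};

for (let idx = 0; idx < 3; idx++) {
  middleFlow[`main_block_${idx}`] = { filters: 128 };
}

console.log(Object.keys(middleFlow)); // [ 'main_block_0', 'main_block_1', 'main_block_2' ]
// middleFlow.other = { filters: 1 }; // would be a compile-time error
```

This is why the diffs can drop the loose `any`/`{}` typing while still allowing dynamically numbered `main_block_*` properties.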

View File

@ -1,8 +1,6 @@
const fs = require('fs');
const path = require('path');
const log = require('@vladmandic/pilogger');
const tf = require('@tensorflow/tfjs-node');
const faceapi = require('../dist/face-api.node.js'); // this is equivalent to '@vladmandic/faceapi'

View File

@ -15,7 +15,7 @@
    "experimentalDecorators": true,
    "importHelpers": true,
    "noFallthroughCasesInSwitch": true,
    "noImplicitAny": true,
    "noImplicitOverride": true,
    "noImplicitReturns": true,
    "noImplicitThis": true,
@ -32,11 +32,11 @@
    "strictBindCallApply": true,
    "strictFunctionTypes": true,
    "strictNullChecks": true,
    "strictPropertyInitialization": true
  },
  "formatCodeOptions": { "indentSize": 2, "tabSize": 2 },
  "include": ["src"],
  "exclude": ["node_modules/**"],
  "typedocOptions": {
    "excludePrivate": true,
    "excludeExternals": true,
@ -48,7 +48,6 @@
    "theme": "default",
    "readme": "none",
    "out": "typedoc",
    "logLevel": "Verbose"
  }
}

View File

@ -0,0 +1 @@
window.hierarchyData = "eJyVWMtu2zgU/ReumVbiU/SumaBFgTYN2mCAQZEFIzGJJjKVUnTTQeF/H1CqbVLWg9wkcHB4z+NeX1H5DUzb2g5svmOBBMQUEYhwISCiVECUFwVElFOIWEYg4hRBJDIMcYZzSAqUQ1TwDOaYIUg4xXcQGPXQqNLWre7A5jdA7oeWWwU2wD4ACJ5rXYENgWBnGrAB27baNap7ax/ePNltAyAoG9l1Dt1VFw584U53L7JUYA8Bw7lX8W9pannfqGPdHBWHyn2dofIBNksx/GEPgXPi1b9VumvNSvUBdJGvVIdgp+sfO3Utt+pGGqX72NEdBOVT3VRGabD5znB+t4fABe/J+Hhl5Out+mXf16qpvrwM6R5UIcoOqmptlXmQpereVka+vpk8OKvzeBoEmlAu6CBKUE/UoqaJpHpBSXqOXUGIE4/53aP6oHSlzLWyS4w+LoIEF8LP/LLd6arWj5ftr5WofWhyuFgQiAgfEsbC9zmlYMJmDPvJJeE0oFgpvVZywgxnwi0I1juiNAj1qt4q3UWMr4dMjpQyNHAH3+UJ6gnHEbSnLGmQ5cf3slRXyqqybvWaPR+bbpAXg0Hub4ljSevzT3gMgBE2WUbGNj9JXW2leV5tYwBONsoyfjco4COj5wJmjK6TTw0yowQiJoYZZpTMsdNoehoTtGBzRKyIZmJFBBWnaNzTz9KWTzH97IHJveR02HGc0pHJkHjG3jKpZ4z5C+eTvFeNqlbW3AmVNCQCD44KnnmUX+7/VaWN+iKOoEnkhx0gMuw38qattV1rYg9KbqDI2B9Gf0pDwgmTy2Snxgnsz8WNUVVd2tXW+bh1Endv9dP6qsrVsBwmOSvC6fEZ6H57pAHnhKNFvpMTko93/2EBrNyKRtAIKorm9h8rEsh6cMqMO499gByLkYCvqmwfde2+NREKQnSEY4F8wr/a7UvbuReIW9k9L5GFyCSvgiBIspxDkonBtSBoJGJn1fDs7kpTv9jWdI7mUnaLL0DLJ9NEshxiUdBBH8vP9b1rmgmiCHnTB2N6VdBzGd9q/diMLEcKmTu6LsW1z79w9nvdFYpq0wQ8pTckwwSSjPUrx304U/In4dWGnKNjrDN2RnhKMo4yxMeQimIy7+NVKC33s2NJ+ecohySn/fu7+zCXf8AS3YfgVEQ0OT0XcMo3UcPMwQgZGGVzbygRe9vHRpCRzB+Hb131ub2vG6WV/ZkvUYXIpKZjlPX9ZsEj8rbW/530tybivyMzRyJcc4pH1P+0TfsTRZIG4CTvDJHeexHM+rXaGdlcK/vamsXRCoBJV1/ECXR3Aug6frff7/8H6ry9zQ=="

View File

@@ -1,22 +1,92 @@
 :root {
-    --light-code-background: #F5F5F5;
+    --light-hl-0: #0000FF;
+    --dark-hl-0: #569CD6;
+    --light-hl-1: #000000;
+    --dark-hl-1: #D4D4D4;
+    --light-hl-2: #0070C1;
+    --dark-hl-2: #4FC1FF;
+    --light-hl-3: #795E26;
+    --dark-hl-3: #DCDCAA;
+    --light-hl-4: #098658;
+    --dark-hl-4: #B5CEA8;
+    --light-hl-5: #001080;
+    --dark-hl-5: #9CDCFE;
+    --light-hl-6: #AF00DB;
+    --dark-hl-6: #C586C0;
+    --light-hl-7: #008000;
+    --dark-hl-7: #6A9955;
+    --light-hl-8: #A31515;
+    --dark-hl-8: #CE9178;
+    --light-hl-9: #267F99;
+    --dark-hl-9: #4EC9B0;
+    --light-code-background: #FFFFFF;
     --dark-code-background: #1E1E1E;
 }
 
 @media (prefers-color-scheme: light) { :root {
+    --hl-0: var(--light-hl-0);
+    --hl-1: var(--light-hl-1);
+    --hl-2: var(--light-hl-2);
+    --hl-3: var(--light-hl-3);
+    --hl-4: var(--light-hl-4);
+    --hl-5: var(--light-hl-5);
+    --hl-6: var(--light-hl-6);
+    --hl-7: var(--light-hl-7);
+    --hl-8: var(--light-hl-8);
+    --hl-9: var(--light-hl-9);
     --code-background: var(--light-code-background);
 } }
 
 @media (prefers-color-scheme: dark) { :root {
+    --hl-0: var(--dark-hl-0);
+    --hl-1: var(--dark-hl-1);
+    --hl-2: var(--dark-hl-2);
+    --hl-3: var(--dark-hl-3);
+    --hl-4: var(--dark-hl-4);
+    --hl-5: var(--dark-hl-5);
+    --hl-6: var(--dark-hl-6);
+    --hl-7: var(--dark-hl-7);
+    --hl-8: var(--dark-hl-8);
+    --hl-9: var(--dark-hl-9);
     --code-background: var(--dark-code-background);
 } }
 
-body.light {
+:root[data-theme='light'] {
+    --hl-0: var(--light-hl-0);
+    --hl-1: var(--light-hl-1);
+    --hl-2: var(--light-hl-2);
+    --hl-3: var(--light-hl-3);
+    --hl-4: var(--light-hl-4);
+    --hl-5: var(--light-hl-5);
+    --hl-6: var(--light-hl-6);
+    --hl-7: var(--light-hl-7);
+    --hl-8: var(--light-hl-8);
+    --hl-9: var(--light-hl-9);
     --code-background: var(--light-code-background);
 }
 
-body.dark {
+:root[data-theme='dark'] {
+    --hl-0: var(--dark-hl-0);
+    --hl-1: var(--dark-hl-1);
+    --hl-2: var(--dark-hl-2);
+    --hl-3: var(--dark-hl-3);
+    --hl-4: var(--dark-hl-4);
+    --hl-5: var(--dark-hl-5);
+    --hl-6: var(--dark-hl-6);
+    --hl-7: var(--dark-hl-7);
+    --hl-8: var(--dark-hl-8);
+    --hl-9: var(--dark-hl-9);
     --code-background: var(--dark-code-background);
 }
 
+.hl-0 { color: var(--hl-0); }
+.hl-1 { color: var(--hl-1); }
+.hl-2 { color: var(--hl-2); }
+.hl-3 { color: var(--hl-3); }
+.hl-4 { color: var(--hl-4); }
+.hl-5 { color: var(--hl-5); }
+.hl-6 { color: var(--hl-6); }
+.hl-7 { color: var(--hl-7); }
+.hl-8 { color: var(--hl-8); }
+.hl-9 { color: var(--hl-9); }
 pre, code { background: var(--code-background); }
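The hunk above replaces the `body.light` / `body.dark` selectors with `:root[data-theme='…']`, so a theme switcher now has to set a `data-theme` attribute on `<html>` instead of toggling a body class. A minimal sketch of a compatible toggle (the `setTheme` helper is hypothetical, not part of this diff):

```javascript
// Hypothetical helper: applies a theme by setting the data-theme attribute
// that the :root[data-theme='light'] / :root[data-theme='dark'] rules match.
// Any other value clears the attribute so the prefers-color-scheme media
// queries take over again.
function setTheme(root, theme) {
  if (theme === 'light' || theme === 'dark') {
    root.dataset.theme = theme;   // matches :root[data-theme='…']
  } else {
    delete root.dataset.theme;    // fall back to the OS preference
  }
  return root.dataset.theme;
}

// In a browser: setTheme(document.documentElement, 'dark');
```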

File diff suppressed because it is too large

typedoc/assets/icons.js (new file, 18 lines)

File diff suppressed because one or more lines are too long

Binary file not shown (before: 9.4 KiB)

typedoc/assets/icons.svg (new file, 1 line)

File diff suppressed because one or more lines are too long

After: 12 KiB

Binary file not shown (before: 28 KiB)

File diff suppressed because one or more lines are too long


@@ -0,0 +1 @@
window.navigationData = "eJydnNty4zYSht9F15PJ2mPPzs6dj4lTtuyyVM6mUqkURLYk7FCAAoKyPFv77lsQSREAuxtwLq3++wOJYzcA+vf/Tizs7eTrpDTidfJhshV2Pfk62eiyqaD+0f36cW031eTD5JtU5eTr2YdJsZZVaUBNvv5+dL9QxVqbJ11LK7UaQKCaTYcJJSH1y/8+HFHXRrxe6v3AKCpR1/3DdNbQ/eQUATxuXTk1z+lEKdytKOBeqHIjzDeOGOjeBU0/LSZPFTGHvb2VUJUM96jJhqWfNZZy6DuqvaSyYJai6MF3XJudnn+OmXz9ovR0HSPl0PWClsHXTcjvn+lmvzVQ105+p7aNHYqwb9u4k8TiqIB//OufJ+enXF9MFxFKUwXMnds1WCgs/waYMAV3jldaWd2YgblsVItouZ4mxH0+i0hD4SxskCV4UXuw0EibQUamJIJ7VP5wMuL+4ZHtcrwI2GXWErAw+rUGg/p3tixOuBIN7+NhxstSUD1LozdPcg8VWi8eZxBm0S7qN1VkIw9qhmt11jP2MrbZpEarXeqsGr/Sm61bluHCGPF22SyXfjP2k3vLw7Ts7P74s1Bl5QO9WbFlHjXcVHivBTvBtihPxcEedAnVhbFyKQrL8UJhPvJOLXU21omT6F9mj9MU0WmSoJnVBh6EEiu2UUZaDvwMfzVQ22uwQlZcfYZCDjkTO7jSailXDG4QpVDPUDeVTaBaEYeaGyGVVKvkk4VCDvkryNXa1g9CySXU9kZZ88aAMTkbovSDaxZMXu2qG42/2WjGGq+3boSNhrRP8wQp1qN6MnrlFrwrUVULUXzDkWNdiuxak31KT5BitTV+LazAUYM9j/ST0c2WQx0EeaxjP4g75JgaSVP8bv25lRVQS5TUH30Vs9Z1sp/n86duBkgiPS0DLrQqhAUlggWJeWDCIS7iQ7vsuXAEtgacSzkJyt2+HWZIrqROwgVyUOgSuvahUYEsFzezBsQmE9qKGTSorCcNZImg6gE22p/qYtagyQLN6OAsgI0nuQC4AuvNYMyrRkIeGS74t9qEyznCRj3+RiF8tXBe7ynMhTDvfyvPiy/Mm635FvGFPLLto7MtFDzR0zHAtbVbGuOsjLOs3WQ3K9awARriqxhYJeu2lpnXGjQcSIsyOd49EYPa6B0kZsujhMEYWMnagnHD7lk3FtCMv+WNtRngQwyYCR60LDjj1T0Rg3pta9m9D5hbUVh27sTUHFzaNRoyjbihMB/JT0OImM1356BqjW81tKYo7/VTFaG8KLPdn7bLj+5nek86Ls/LiVtTvK0SJsEvwkixqAAF9EYOMBUbKNuSHgQWO4aC5IbcQXhyjYB6Ux7ilEacZiI+0YhPmYgzGnGWiTinEeeZiHv5DUiIM6YwN9OXwX/X9YoD42b6Ejp/8v3kRqyA8DzYGN8dmDo4sAm8OyvjL8qSGNWiLJnZQexWT1pTk2Jn5RIKYYv1VBsqvj3audxBkGmIM3Geldxevr2IqqHWa0+RzF4oxMHIe+9OqepvjWyusLXrV1m7/RMaE6k4ntxRDLnjEos9FTjBnguaYL8VqryWGyo6GQRc+iArqhM6ExfTCPvQUL6tkfXeM/2/s3L+dNFsuVtBtfRWcK1roGoIR2diPeu12FLjpLMy/nUhKkFFJK2R84atMG4+Y3t5pOJ4crXRkuS0Vs6/kgVVFwdbyvcTWXZr5fz10m7EnvJvrZy/DTboQm872pQLfZsF5dksuEOSKPAKXFvjD6dJ/xOq0nozcioWM8jO05szGGTj9eYMxhnPOEswZEllDc7EelZUx3UmztMIVW91Tbr3dobRKK73dVbG/zsYTS0UBxubbjQ2ONXos43Dz1knbIXebBvrzhfcbOcWJVDkgXCLJV34bY0rcAcGT1oqNK5o2aGO3ZnIeVRfxcJudqA4jLOzgNtK4/FSTzgIWEScyo0ZWAaJQvwEisLEmRQKOs0AxfkUCvqUAYq
zKhR0lgGKc6sI9CIqWU6bzQLfT+hZniyNezJ6IRaVtBKdxQKmp+XCE6FW6MTUsg5mzl03Cp2QO3dnZqeWn0CVfgW1GxLtr/R+xMUKWskUvOHQbyr4Vm5X4dI9nVQr9LadZ+QZqC/r094hcPHWXNTfxu6hPUVqLFxUlbvkcg11YeTWalPTYFKeUQ7idSlqZFuH12eUNJNqVUX+7DtRDlxZ7S2mrjaON4TwYmhtdgk55Ezi6CHwZmDEeWVksd/BHFopq7oJ+XvKycMnqUgUcGQRa39IaHtmd2tuDAnM2Rx09osVKdpw5Y7EBZJ8HlJZ7B2/Matv9s9fyGcLJPm8uVRvGcxOlstNEt/BImov63p1IDxPgM6zSZ+/JFCfv6RYD25jEqccTFn+2PU8z5hiPEOhV0qyIyjUcMR7sYAKSjQQGGwZhGi1JGmRjj03ARvdNu5ZvYX3boyopmBftUEm0cDMcR4X/4HCMrNfJOBYUWbXE5BELvIzUMrCEu3kWznKMxRI4e5XzmtWlw96IStQYHcnY//Qnk8iv0NAZRzXTXPDuoGd9MWK99DI5ySEKfZvutK7UxzX2vII7HMFEo53h6YR3r3FOzKViD+mQKIMn0NFGhHmWKPBWPNBviKJQtajmEWsSQgsmv5jELIERJBo/PsAZA6InMPh6/uOh3Dkmhh5Pitj9EVwpkv64ES3DKEXK7hQZZsHd9Nb0B/aA1pclTqpvRyfPra4S/zYcUxwBw2/SrsmSSNBingNS9FUdqijKdgnYYR/PtaiaWXyhFrtpNFqA2r0/Y5nSlGCAP222zGJeajoXeRRV0LAeD8ac2UFs7fawqiZBkuKcZdcFVoipUvxj0MOb/TInKJNwT42FvlO62jIIOBPkt3bhg6KX26O7fm8m701orCXeg91/GFAzB5r88sh6+BvjL35A5RS3FSAjb7AmCSNo+KOggfFCOEZal3toCRJsSBFdFPdxWp0f6b7Occ7zAowUKjIZ47C9hiJhu00Ed0gCJnkFgFNRcKUkEkFKSgx3pYeUNjW9JgByrsNMtwpAhVdBQkuE91eXN38efPvp+eb2ezucfrn/cXlzf0M4+BKBq3Av8U6gNzvjBtzPSrjblS3pYkdD/Q25mihl1DZ0xjG5VEoGksmxlgqrQiRr0Lawyx0uHOKnonEGga3OHyWMdd34d22gRUIGFCB7cRjQFTIgQ0IC1dC7QTawr49E3N7+EahlCLFOwqTYHq7NCbz26YEmtpJwuDcjhKCT3d7TJfE0pl+DOYzfgJNj6ZYw+DK4EQEg4WKJCr9rfhY9j6oe69MsJMm4RlPm/+kwzkHTRs0DA6aopIlCHUtaysUzhuJOODegipHwY8H8wVZoPRMR2nfgR+FRDQdDY4YeOK/JZDi7ALYrkVIs+BxzIRRsegpxrl0gxz5vj0P017WSME6FYNcgi3W5Ho8WFOIX2q84xyNKcC0/ygLfadIkoI9mvnaaPQfTvj2FOZFlqBJyMGauLellfvjtGQeCJHx0EOMwN/gGqu4WziufdueMtd06IPIklA9+6sRBu1agYADKReTw4y+ohsqOJRGbztLzd10ljWeonv+NZ2nYyg6lA4ELIhZXI7GJCA56yOyLGhitkeFWWB2lkdkSSg9u/t2BuM+lTzezCI/CRyrEsigzlnsWJmBHuo9yY6kGfC+9pPoQPgOsIs1s+FHcUYBXjKT5MfaBD5MaFg6Ik3A46SGxaPijALaBCeJ9mQJaLuWB18+hrijgAUVXSaKzgmemYFs3HEMv6JGEg4m1eVCox9kdCbGWWn1IPazZtsPOgwzErFAsxGV/I4uFUcjA9iKklvGPTMHaQ/h/HM5FDaWMVDTjsDvkLHkUFoWX8vv3T8PIpiegAe5XfNoSz3gHO0Mpl43y2XV/n8TDOPbOQwdSGV86JTcvgkVia9W0ps2sSYJpLdqBisP0ePjDw+i8bOPALFzF9Xdtl50wDVgQsUI9cf/ASvHfOE="

File diff suppressed because one or more lines are too long

File diff suppressed because it is too large

Binary file not shown (before: 480 B)

Binary file not shown (before: 855 B)

File diff suppressed because one or more lines are too long

Some files were not shown because too many files have changed in this diff