mirror of https://github.com/vladmandic/human
update docs
parent: 5783a9557f
commit: 8d403c2d9d

Demos.md (37 changed lines)
@@ -194,4 +194,41 @@ found 0 vulnerabilities
...
```

## Human as a Daemon

If you want to run `Human` as a `systemd` service on `Linux`,
take a look at the included sample `human.service` file and
modify the NodeJS path in `ExecStart` and your working folder in `WorkingDirectory`.
```text
[Unit]
Description=human
After=network.target network-online.target

[Service]
Type=simple
Environment="NODE_ENV=production"
ExecStart=/home/vlado/.nvm/versions/node/v15.7.0/bin/node server/serve.js
WorkingDirectory=/home/vlado/dev/human
StandardError=inherit
Restart=always
RestartSec=300
User=vlado
StandardOutput=null

[Install]
WantedBy=multi-user.target
```
To activate the service:

- copy the content to a `human.service` file in your `/etc/systemd/system` folder
- reload the service configuration: `sudo systemctl daemon-reload`
- start the service: `sudo systemctl start human`
- stop the service: `sudo systemctl stop human`
- check the service status: `sudo systemctl status human`
- to run the service on system startup: `sudo systemctl enable human`
- to disable running the service on system startup: `sudo systemctl disable human`
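The service itself just runs a NodeJS script with the repository as its working directory. As a rough illustration only, a minimal static file server could stand in for the script referenced in `ExecStart` (this is a hypothetical sketch, not the actual `server/serve.js` from the repository; the port and served paths are assumptions):

```js
// hypothetical stand-in for the script referenced in ExecStart
// the real server/serve.js in the repository may differ substantially
const http = require('http');
const fs = require('fs');
const path = require('path');

const port = 8000;           // assumed port
const root = process.cwd();  // systemd sets the working directory via WorkingDirectory

http.createServer((req, res) => {
  const url = req.url.split('?')[0];
  const file = path.join(root, url === '/' ? 'demo/index.html' : url);
  fs.readFile(file, (err, data) => {
    if (err) { res.writeHead(404); res.end('not found'); return; }
    res.writeHead(200);
    res.end(data);
  });
}).listen(port, () => console.log(`serving ${root} on port ${port}`));
```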
<br>
Home.md (49 changed lines)

@@ -1,18 +1,31 @@
# Human Library

## 3D Face Detection, Face Embedding & Recognition, Body Pose Tracking, Hand & Finger Tracking, Iris Analysis, Age & Gender & Emotion Prediction & Gesture Recognition
**3D Face Detection & Rotation Tracking, Face Embedding & Recognition,**
**Body Pose Tracking, Hand & Finger Tracking,**
**Iris Analysis, Age & Gender & Emotion Prediction**
**& Gesture Recognition**

<br>

### Project pages
Native JavaScript module using TensorFlow/JS Machine Learning library
Compatible with *Browser*, *WebWorker* and *NodeJS* execution on both Windows and Linux

- [**Live Demo**](https://vladmandic.github.io/human/demo/index.html)
- Browser/WebWorker: Compatible with *CPU*, *WebGL*, *WASM* and *WebGPU* backends
- NodeJS: Compatible with the software backend *tfjs-node* and the CUDA-accelerated backend *tfjs-node-gpu*

<br>

## Project pages

- [**Live Demo**](https://vladmandic.github.io/human/demo/index.html)
- [**Code Repository**](https://github.com/vladmandic/human)
- [**NPM Package**](https://www.npmjs.com/package/@vladmandic/human)
- [**Issues Tracker**](https://github.com/vladmandic/human/issues)
- [**Change Log**](https://github.com/vladmandic/human/CHANGELOG.md)

### Wiki pages
<br>

## Wiki pages

- [**Home**](https://github.com/vladmandic/human/wiki)
- [**Demos**](https://github.com/vladmandic/human/wiki/Demos)

@@ -23,7 +36,9 @@
- [**Face Embedding and Recognition**](https://github.com/vladmandic/human/wiki/Embedding)
- [**Gesture Recognition**](https://github.com/vladmandic/human/wiki/Gesture)

### Additional notes
<br>

## Additional notes

- [**Notes on Backends**](https://github.com/vladmandic/human/wiki/Backends)
- [**Development Server**](https://github.com/vladmandic/human/wiki/Development-Server)

@@ -35,14 +50,28 @@
<br>

Compatible with *Browser*, *WebWorker* and *NodeJS* execution on both Windows and Linux
## Default models

- Browser/WebWorker: Compatible with *CPU*, *WebGL*, *WASM* and *WebGPU* backends
- NodeJS: Compatible with the software backend *tfjs-node* and the CUDA-accelerated backend *tfjs-node-gpu*
- (and possibly with React-Native, as it doesn't use any DOM objects)
Default models in the Human library are:

- **Face Detection**: MediaPipe BlazeFace-Back
- **Face Mesh**: MediaPipe FaceMesh
- **Face Iris Analysis**: MediaPipe Iris
- **Emotion Detection**: Oarriaga Emotion
- **Gender Detection**: Oarriaga Gender
- **Age Detection**: SSR-Net Age IMDB
- **Body Analysis**: PoseNet
- **Face Embedding**: Sirius-AI MobileFaceNet Embedding

Note that alternative models are provided and can be enabled via configuration.
For example, the `PoseNet` model can be switched for the `BlazePose` model depending on the use case.
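As a rough illustration, such a switch could look something like the configuration override below (a hedged sketch: the exact option names and model file name are assumptions; see the configuration page linked below for the actual schema):

```js
// hypothetical configuration override: option and file names are illustrative only
const config = {
  body: {
    enabled: true,
    modelPath: 'blazepose.json', // assumed file name; the default configuration points to the PoseNet model
  },
};
// configuration is normally passed as a parameter to detect(), inside an async function
const result = await human.detect(inputImage, config); // inputImage is a placeholder input element
```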
For more info, see [**Configuration Details**](https://github.com/vladmandic/human/wiki/Configuration) and [**List of Models**](https://github.com/vladmandic/human/wiki/Models)

<br>

*This is a pre-release project, see [issues](https://github.com/vladmandic/human/issues) for a list of known limitations and planned enhancements*
*See [**issues**](https://github.com/vladmandic/human/issues?q=) and [**discussions**](https://github.com/vladmandic/human/discussions) for a list of known limitations and planned enhancements*

*Suggestions are welcome!*

<br><hr><br>

@@ -1,6 +1,6 @@
# Profiling

If `config.profile` is enabled, a call to `human.profile()` will return detailed profiling data from the last detect invocation.
If `config.profile` is enabled, a call to `human.profileData()` will return detailed profiling data from the last detect invocation.

example:
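A minimal sketch of such a call (hypothetical: assumes `config.profile` was enabled for the preceding `detect()` run and that partial configuration objects are merged with defaults):

```js
// hypothetical usage sketch: enable profiling, run detection, then read the collected data
const result = await human.detect(inputImage, { profile: true }); // inputImage is a placeholder input
const profile = human.profileData(); // detailed timings from the last detect() invocation
console.log(profile);
```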

Usage.md (8 changed lines)

@@ -26,12 +26,12 @@ or if you want to use promises
Additionally, the `Human` library exposes several objects and methods:
```js
human.config // access to configuration object, normally set as parameter to detect()
human.defaults // read-only view of default configuration object
human.models // dynamically maintained list of loaded models
human.version // string containing version of human library
human.tf // instance of tfjs used by human
human.config // access to configuration object, normally set as parameter to detect()
human.state // <string> describing current operation in progress
// progresses through: 'config', 'check', 'backend', 'load', 'run:<model>', 'idle'
human.sysinfo // object containing current client platform and agent
human.load(config) // explicitly call load method that loads configured models
// if you want to pre-load them instead of on-demand loading during 'human.detect()'
human.image(image, config?) // runs image processing without detection and returns a canvas

@@ -40,6 +40,8 @@ Additionally, the `Human` library exposes several objects and methods:

human.simmilarity(embedding1, embedding2) // runs similarity calculation between two provided embedding vectors
// vectors for source and target must be previously detected using
// the face.embedding module
human.models // dynamically maintained list of loaded models
human.classes // dynamically maintained list of classes that perform detection on each model
```
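For orientation, a short usage sketch combining some of the objects and methods listed above (a hedged example: `myConfig` and `inputImage` are placeholders supplied by the caller, and the calls are assumed to run inside an async function):

```js
// hedged usage sketch: myConfig and inputImage are placeholders, not part of the library
console.log(human.version);               // version of the human library
await human.load(myConfig);               // optionally pre-load configured models instead of lazy loading in detect()
console.log(human.models);                // dynamically maintained list of loaded models
const result = await human.detect(inputImage, myConfig);
console.log(human.state);                 // should read 'idle' once detection has completed
console.log(result);                      // detection results
```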
Plus additional helper functions inside `human.draw`: