# VisualGestures.js
VisualGestures.js is an open-source TypeScript package that empowers users to effortlessly control the cursor, including actions such as hover, click, drag, and drop, through precise finger movements in the air.

- **Immersive and Engaging User Experience**: Offers a unique, touch-free interaction model that users often find more immersive than conventional interfaces, making it particularly appealing for applications in gaming, virtual reality, and the creative industries.
- **Offline Support**: Developed in TypeScript and runs fully offline, without a continuous internet connection, so it works anywhere.
- **Customizable for Various Applications**: Can be tailored to different industries, such as controlling machinery in factories, navigating interfaces in automotive displays, or interacting with public information kiosks without touching a screen.

Click here to watch a demo of how it works!
## Table of Contents

- Getting Started
- Comprehensive Ecosystem
- Compatibility
- Future Scope
- Contribute
- Support
- Maintainers
- Citation
## Getting Started
### 1. Install our npm package

```bash
npm install @learn-hunger/visual-gestures
```
### 2. Integrate into your existing website

```typescript
import { VisualGestures } from "@learn-hunger/visual-gestures/dist/";

/**
 * Create an instance of VisualGestures, which accepts optional parameters:
 * the container and the landmark to be used as the pointer
 * (defaults: document body and landmark 8, the index fingertip, respectively).
 */
const vg = new VisualGestures();

// Get hand landmarks from MediaPipe's tasks-vision.
// Here `video` corresponds to an 'HTMLVideoElement' with a live webcam stream.
const landmarks = handDetector.detectForVideo(video, performance.now());
vg.detect(landmarks.landmarks[0], performance.now());

// The virtual cursor can be seen once the model has loaded and detection has started successfully.
```
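The snippet above assumes a `handDetector` already exists and that a webcam stream is attached to `video`. Below is a minimal, illustrative sketch of that setup using MediaPipe's `@mediapipe/tasks-vision` package; the helper name `createHandDetector`, the per-frame `loop`, and the hosted model URL are assumptions for illustration, not part of this package's API:

```typescript
import { FilesetResolver, HandLandmarker } from "@mediapipe/tasks-vision";

// Hypothetical helper: attach a webcam stream and build a HandLandmarker.
async function createHandDetector(video: HTMLVideoElement): Promise<HandLandmarker> {
  video.srcObject = await navigator.mediaDevices.getUserMedia({ video: true });
  await video.play();

  // Load the WASM runtime, then the hand-landmark model (MediaPipe's hosted asset).
  const vision = await FilesetResolver.forVisionTasks(
    "https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision@latest/wasm"
  );
  return HandLandmarker.createFromOptions(vision, {
    baseOptions: {
      modelAssetPath:
        "https://storage.googleapis.com/mediapipe-models/hand_landmarker/hand_landmarker/float16/1/hand_landmarker.task",
    },
    runningMode: "VIDEO", // required for detectForVideo()
    numHands: 1,
  });
}

// Hypothetical per-frame loop: feed each webcam frame's landmarks to the
// `vg` instance created in the snippet above.
function loop(handDetector: HandLandmarker, video: HTMLVideoElement): void {
  const result = handDetector.detectForVideo(video, performance.now());
  if (result.landmarks.length > 0) {
    vg.detect(result.landmarks[0], performance.now());
  }
  requestAnimationFrame(() => loop(handDetector, video));
}
```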
For more information about `handDetector`, refer to the MediaPipe HandLandmarker documentation.

### 3. Available Events

Refer to the quick guide below for effective gesture usage.
```typescript
// A comprehensive list of all potential event types can be found within 'EVgMouseEvents'.
import { EVgMouseEvents } from "@learn-hunger/visual-gestures/dist/app/utilities/vg-constants";

// Currently offered cursor-control events:
vgPointerMove();  // corresponds to 'onmousemove'
vgPointerEnter(); // corresponds to 'onmouseenter'
vgPointerLeave(); // corresponds to 'onmouseleave'
vgPointerDown();  // corresponds to 'onmousedown'
vgPointerUp();    // corresponds to 'onmouseup'
vgPointerClick(); // corresponds to 'onclick'
vgPointerDrag();  // corresponds to 'onmousedrag' ('onclick' + 'onmousemove')
vgPointerDrop();  // corresponds to 'onmousedrop' ('onclick' + 'onmousemove' + 'onmouseup')
```
For each event, you can register a callback on the VisualGestures instance (3.1) or use traditional event listeners (3.2), similar to cursor-based controls.
#### 3.1. Instance-Based Listening
The callback below corresponds to the 'onmousemove' event in traditional cursor-based controls:

```typescript
vg.mouseEvents.onPointerMove = () => {
  // console.log("callback pointer moved");
};
```
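The other events can be handled the same way. As a brief sketch, assuming the remaining instance callbacks follow the same `onPointer*` naming pattern as `onPointerMove` (the names below are inferred for illustration, not confirmed by this README):

```typescript
// Inferred naming pattern, for illustration only.
vg.mouseEvents.onPointerDown = () => {
  // e.g. visually highlight the element under the virtual cursor
};
vg.mouseEvents.onPointerUp = () => {
  // e.g. clear the highlight and commit the action
};
```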
#### 3.2. Traditional Event-Based Listening
The listener below corresponds to the 'onmousemove' event in traditional cursor-based controls:

```typescript
import { EVgMouseEvents } from "@learn-hunger/visual-gestures/dist/app/utilities/vg-constants";

document.addEventListener(EVgMouseEvents.MOUSE_MOVE, () => {
  // console.log("callback pointer moved");
});
```
Similarly, the MOUSE_ENTER, MOUSE_LEAVE, MOUSE_DOWN, MOUSE_UP, MOUSE_CLICK, MOUSE_DRAG, and MOUSE_DROP events can be listened for via instance-based or traditional event-based listening, as sketched below.
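For example, here is a minimal sketch wiring up press and release in the traditional style; the enum members come from the list above, while the log messages are illustrative:

```typescript
import { EVgMouseEvents } from "@learn-hunger/visual-gestures/dist/app/utilities/vg-constants";

// Mirror of the MOUSE_MOVE example above, for press and release.
document.addEventListener(EVgMouseEvents.MOUSE_DOWN, () => {
  console.log("virtual pointer pressed");
});
document.addEventListener(EVgMouseEvents.MOUSE_UP, () => {
  console.log("virtual pointer released");
});
```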
## Comprehensive Ecosystem
Our project integrates tools such as Central Logger, Vercel auto-builds, GitHub release management, debugging tools, CI/CD pipelines, and automated code reviews, giving developers the flexibility and performance needed to innovate and contribute effortlessly.
## Compatibility
**Desktop Platforms**

| OS/Browser | Chrome | Edge | Firefox | Safari | Opera |
| :---: | :---: | :---: | :---: | :---: | :---: |
| Windows | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ |
| macOS | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ |
| Ubuntu LTS 18.04 | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ |
**Mobile Platforms**

| OS/Browser | Chrome | Edge | Firefox | Safari | Opera |
| :---: | :---: | :---: | :---: | :---: | :---: |
| iOS | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ |
| Android | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ |
## Future Scope
As we continue to refine the system, future work will focus on improving algorithmic precision for more accurate gesture recognition in diverse environments, including low-light conditions and faster hand movements, making interaction more robust and responsive.
## Contribute
We'd love to have your contribution to VisualGestures.js. Please refer to CONTRIBUTING.md.

- ⭐ Star the repository to show your appreciation.
- 🐛 Report bugs and suggest improvements by opening issues.
- 🔥 Contribute to development by submitting pull requests.
## Support
We greatly appreciate your support in making VisualGestures even better!

- 🌍 Share the project with your community to help spread the word.
- 💼 If you're interested in sponsoring our work, we would love your support! Your sponsorship helps us continue innovating and delivering high-quality updates. Please reach out to us directly for more information.

Your kind feedback, contributions, and sponsorships are invaluable in helping us continue to improve and grow this project!
## Maintainers
Nagendra Dharmireddi & Boddu Sri Pavan

Join our Discord community to engage with us directly, and don't forget to follow us on LinkedIn and GitHub to stay updated on our latest projects, collaborations, and innovations!
## Citation
```bibtex
@software{visual-gestures,
  package      = {@learn-hunger/visual-gestures},
  authors      = {Nagendra Dharmireddi & Boddu Sri Pavan},
  title        = {{visual-gestures}},
  year         = {2024},
  version      = {0.0.1},
  url          = {https://github.com/learn-hunger/visual-gestures},
  howpublished = {\url{https://www.npmjs.com/package/@learn-hunger/visual-gestures}}
}
```