Hand reading app with deep learning?
Is it really possible to predict a person's future by scanning their hand and running it through a deep learning model?
I am thinking about an app where people can take a pic of their hand and upload it to a remote database.
A deep learning algorithm would then infer and report predictions about each user's future.
I know of no long-term studies of palm reading. In my view, this would be a gimmick.
However, you could use machine learning to take an image of a hand and output the following characteristics (a rough sketch follows the list):
- whether it is a hand
- relative length, curvature, and forks of "head line"
- relative length, curvature, and forks of "heart line"
- relative length, curvature, and forks of "life line"
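
A minimal sketch of what such a model could look like in PyTorch, assuming a two-headed CNN: one head classifies whether the image contains a hand at all, the other regresses nine values (relative length, curvature, and fork count for each of the three lines). The architecture, input size, and feature encoding are all assumptions on my part, and you would still need a labeled dataset to train it.

```python
import torch
import torch.nn as nn

class PalmLineNet(nn.Module):
    """Two-headed CNN sketch: one head outputs the probability that
    the image shows a hand, the other regresses crude per-line
    features for the head, heart, and life lines. Architecture,
    input size, and the 9-value encoding are illustrative assumptions."""

    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Head 1: is this a hand at all?
        self.is_hand = nn.Sequential(nn.Linear(64, 1), nn.Sigmoid())
        # Head 2: 3 lines x (relative length, curvature, fork count).
        self.line_features = nn.Linear(64, 9)

    def forward(self, x):
        z = self.backbone(x)
        return self.is_hand(z), self.line_features(z)

model = PalmLineNet()
dummy = torch.randn(1, 3, 224, 224)  # one RGB palm photo
hand_prob, features = model(dummy)
print(hand_prob.shape, features.shape)  # torch.Size([1, 1]) torch.Size([1, 9])
```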
These features can then drive the selection from a finite set of traits, each with a corresponding paragraph for the user to read (see the sketch below). The paragraphs would be written by a copywriter or taken from a palm reading guide.
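
As an illustration, the mapping from extracted features to trait paragraphs could be a plain lookup table. The thresholds and the trait text here are hypothetical placeholders, not calibrated values.

```python
# Hypothetical trait paragraphs; in practice these would come from
# a copywriter or a palm reading guide.
TRAIT_TEXT = {
    ("head", "long"): "You think things through carefully before acting.",
    ("head", "short"): "You trust your instincts and act quickly.",
    ("heart", "curved"): "You wear your feelings openly.",
    ("heart", "straight"): "You keep your emotions measured and private.",
    ("life", "forked"): "You have faced a major change of direction.",
    ("life", "unbroken"): "You tend toward steady routines.",
}

def describe(line: str, relative_length: float, curvature: float, forks: int) -> str:
    """Map one line's raw features onto a trait from the finite set.
    The cutoffs below are illustrative assumptions."""
    if line == "head":
        key = "long" if relative_length > 0.5 else "short"
    elif line == "heart":
        key = "curved" if curvature > 0.3 else "straight"
    else:  # life line
        key = "forked" if forks > 0 else "unbroken"
    return TRAIT_TEXT[(line, key)]

print(describe("heart", relative_length=0.6, curvature=0.4, forks=0))
```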