
Add mechamarkers to this sim #8

Closed
zepumph opened this issue Apr 7, 2020 · 10 comments

zepumph commented Apr 7, 2020

At a tangibles design meeting today, we decided that mechamarkers would be a great technology for prototyping tangible input in this simulation.

After meeting with @Petroochio and @clementzheng, we determined that the best path forward is to use just three markers. One will be a "base" marker, attached in a fixed position near the bottom of the camera view. The other two markers will act as the objects that form the ratio (see the sketch after the implementation notes below).

I will get started!

Implementation notes:

  • I think I should create a new repo to house mechamarkers code, since it is now used by both GFL and Proportion.
  • It was also mentioned that it may be nice for each object to be a different shape, so that it is less confusing.
  • For now we will only have tangible input for the Y axis; we won't worry about the X axis.
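
To make the Y-axis mapping concrete, here is a rough sketch of how the two object markers' heights above the base marker could drive the two ratio values. This is not the Mechamarkers API or the sim's actual model; the marker objects, `CAMERA_HEIGHT`, and the value Properties are placeholders for illustration.

```js
// Sketch only: assumes each marker exposes a `present` flag and a `center.y`
// in camera pixels (y increasing downward); the real marker API may differ.
const CAMERA_HEIGHT = 480; // assumed camera resolution height, in pixels

function updateRatioFromMarkers( baseMarker, leftMarker, rightMarker, leftValueProperty, rightValueProperty ) {

  // Ignore frames where any marker is not detected.
  if ( !baseMarker.present || !leftMarker.present || !rightMarker.present ) {
    return;
  }

  // Height above the stationary base marker, normalized to [ 0, 1 ].
  const normalize = marker => {
    const heightAboveBase = baseMarker.center.y - marker.center.y;
    return Math.max( 0, Math.min( 1, heightAboveBase / CAMERA_HEIGHT ) );
  };

  // Only the Y axis drives the ratio for now (see the notes above).
  leftValueProperty.value = normalize( leftMarker );
  rightValueProperty.value = normalize( rightMarker );
}
```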
@zepumph zepumph self-assigned this Apr 7, 2020
zepumph added a commit to phetsims/tangible that referenced this issue Apr 7, 2020
zepumph added a commit to phetsims/perennial that referenced this issue Apr 7, 2020
zepumph added a commit to phetsims/gravity-force-lab that referenced this issue Apr 7, 2020
zepumph added a commit that referenced this issue Apr 7, 2020

zepumph commented Apr 8, 2020

I created the tangible repo to house common code between GFL and Proportion, and then added marker input to both screens of Proportion. Unfortunately, it broke using on-screen input to adjust the Y axis in the free-objects screen. I'll get to that first thing next time.


zepumph commented Apr 8, 2020

@samreid mentioned that handtrack.js could be an interesting technology to explore. https://towardsdatascience.com/handtrackjs-677c29c1d585
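
In case it helps exploration, here is a minimal sketch using handtrack.js's documented `load`, `startVideo`, and `detect` calls. The polling interval and what we would do with the bounding box (e.g. map its vertical position onto a ratio value) are placeholders, not a proposal for the actual integration.

```js
import * as handTrack from 'handtrackjs';

const video = document.querySelector( 'video' );

// Load the detection model, start the webcam feed, then poll for hand predictions.
handTrack.load().then( model => {
  handTrack.startVideo( video ).then( () => {
    setInterval( () => {
      model.detect( video ).then( predictions => {

        // Each prediction includes a bbox of [ x, y, width, height ] in video pixels.
        if ( predictions.length > 0 ) {
          console.log( predictions[ 0 ].bbox );
        }
      } );
    }, 100 );
  } );
} );
```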


brettfiedler commented Apr 8, 2020

Oh, cool! I think the mechamarkers are still the way to go (attachment to objects for the tangible experience), but if this is super easy, it might be a nice deployable option. And there might be potential here that is not immediately obvious to me.

zepumph added a commit that referenced this issue Apr 8, 2020
zepumph added a commit that referenced this issue Apr 8, 2020

samreid commented Apr 9, 2020

> I think the mechamarkers are still the way to go (attachment to objects for the tangible experience)

Is there research that indicates that moving physical markers is more pedagogically effective than moving your hands without holding physical markers?


brettfiedler commented Apr 9, 2020

"Pedagogically effective" is relative to the experience or concept you're trying to convey. We're seeking to implement/design tangibles that afford a property relevant to a particular aspect of a given concept (this may include shape, temperature, deformation, size, etc. of the object).

There are cases where your hands may be enough! Say you're really interested in conveying the thermal energy transfer of friction: rubbing your hands together would be a wonderful tangible experience for that (and could control the books in Friction, for instance).

My point is that our goals may include hand tracking, but our current focus is on the properties of tangible objects that enhance the (non-visual) experience, i.e., their affordances for building scientific reasoning and an understanding of relationships.

brettfiedler commented:

I should also say that, as we head further down the road and consider "deployment" options, something as lightweight as hand tracking could be an excellent option for embodied sim control without the need for "accessories". However, this is only one way (and a well-studied one: https://edrl.berkeley.edu/projects/kinemathics/) that we hope to enable learners to interface with Proportion.


brettfiedler commented Apr 13, 2020

Mechamarker implementation & Instructions (By Michael, Clement, Peter and co!): https://docs.google.com/document/d/1s72BACDjC7O5cEuZOA1QiJFH7_t7CMXjHt7x2yMAyHo/edit?usp=sharing


zepumph commented Apr 15, 2020

@BLFiedler, initial mechamarkers support is working for both screens and has also been integrated with sound over in #9. Is there anything else you would like to do in this issue?

brettfiedler commented:

Let's wait until our tangibles meeting next week. If nothing relevant to the implementation as is comes up, I'll close this one and we can open new issues for specifics. I'll leave it assigned to me for now.

brettfiedler commented:

I think the spirit of this issue has been accomplished. Let's generate new issues for anything that comes up for the mechamarkers.
