<br><br>
<div><span class="gmail_quote">On 4/3/07, <b class="gmail_sendername">mathew davis</b> <<a href="mailto:someoneinjapan@gmail.com">someoneinjapan@gmail.com</a>> wrote:</span>
<blockquote class="gmail_quote" style="PADDING-LEFT: 1ex; MARGIN: 0px 0px 0px 0.8ex; BORDER-LEFT: #ccc 1px solid">
<div><span class="q">
<blockquote class="gmail_quote" style="PADDING-LEFT: 1ex; MARGIN: 0px 0px 0px 0.8ex; BORDER-LEFT: #ccc 1px solid"><br>To test this, we could capture and share some raw (time,x,y) event<br>streams from a touch screen, and try processing them offline. Then we
<br>don't need actual hardware (a spreadsheet may even be good enough)<br>to figure out algorithms.</blockquote>
<div> </div></span>
<div>I agree; I think this would be very beneficial. It would give us a little more to chew on than speculation, and we could use numbers instead of ambiguous scenarios. Do you have some raw (time,x,y) event stream data? I would be very interested in looking at it. If not, who would have it? Or I guess I should say, who would be willing to collect it?
</div><br> </div></blockquote></div><br>
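<div>To make the offline-processing idea concrete, here is a minimal sketch of what could be done with such a capture. The event list and its units are made up for illustration (time in seconds, x/y in pixels); any real capture format would replace them. It just computes per-interval speed between consecutive samples, the kind of quantity a gesture-detection algorithm might start from:</div>

```python
# Hypothetical raw (time, x, y) touch samples; the values are invented.
# Time is in seconds, x/y in pixels.
events = [
    (0.00, 100, 200),
    (0.02, 104, 203),
    (0.04, 112, 209),
    (0.06, 124, 218),
]

def velocities(stream):
    """Speed (pixels/second) between each pair of consecutive samples."""
    out = []
    for (t0, x0, y0), (t1, x1, y1) in zip(stream, stream[1:]):
        dt = t1 - t0
        dist = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
        out.append(dist / dt)
    return out

print(velocities(events))
```

<div>The same arithmetic would work in a spreadsheet, as suggested above: one column per field, one row per event, and a derived column for the distance/time ratio.</div>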