We all know how Windows 8 introduced touch as a first-class citizen in user input. If you have used Windows 8, touch just feels intuitive.. hell, I used to touch my non-touch laptop occasionally, and not because I wanted to fondle it :)
Rewinding a bit.. in Silverlight and Windows Phone we had the trusty old Touch.FrameReported event. This was the fastest way to detect pointer / touch interaction on the screen, and it supported multi-touch detection by exposing TouchFrameEventArgs.GetTouchPoints.
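If you have never seen it, it worked roughly like this (a from-memory sketch, not a verbatim sample — the handler name is mine):

```csharp
// Silverlight / Windows Phone: one static event fired for every touch frame
Touch.FrameReported += OnFrameReported;

private void OnFrameReported(object sender, TouchFrameEventArgs e)
{
    // GetTouchPoints returns every active touch point in the frame,
    // each carrying its own TouchDevice.Id - multi-touch for free
    foreach (TouchPoint p in e.GetTouchPoints(null))
    {
        Debug.WriteLine("{0}: {1} at {2}", p.TouchDevice.Id, p.Action, p.Position);
    }
}
```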
We also know how famously the .NET components were sliced and diced (let's leave that for some other time, when I have had a drink or two). So it's only natural that WinRT 1.0 native XAML didn't get Touch.FrameReported.
The other high-level mechanism for touch event detection was the set of manipulation events:
* UIElement.ManipulationStarted,
* UIElement.ManipulationDelta and
* UIElement.ManipulationCompleted
These made it to WinRT 1.0 XAML.. however, there were changes. They now come armed with UIElement.ManipulationMode.
The RTFM meme definitely applies to me – I never read the manual… When we had the proper MSDN library for documentation it was all good.. now the documentation is all over the place.. (Damn it… need to focus more… another thing to talk over a drink).
So what happened.. when trying to detect touch / pointer events, I started using the manipulation events, but hell no.. the events just wouldn't get called. It turns out you have to opt in: ManipulationMode has tons of options, but to handle the input yourself you need to set it to ManipulationMode.All. Someone remind me of something.. if I declare a bleeding event handler, isn't my intention already pretty clear? (Something perfectly logical from .NET, lost in translation on its way to the WinRT design).
So eventually I got that to work by setting ManipulationMode to All. Surprise surprise, the manipulation event args only expose a few things like Position and PointerType… so this definitely can't do multi-touch. Sure, it will detect all the touch inputs, but it can't tell the difference between two fingers!!
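To make that concrete, here is a minimal sketch of the opt-in dance (LayoutRoot and the handler name are my placeholders):

```csharp
// Without this line, ManipulationDelta never fires
LayoutRoot.ManipulationMode = ManipulationMode.All;
LayoutRoot.ManipulationDelta += LayoutRoot_ManipulationDelta;

private void LayoutRoot_ManipulationDelta(object sender, ManipulationDeltaRoutedEventArgs e)
{
    // Position and PointerDeviceType are about all you get -
    // no per-finger id, so no way to tell two touch points apart
    Point position = e.Position;
    PointerDeviceType type = e.PointerDeviceType;
}
```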
Time to give it the middle one and start looking at alternatives. Previously in WPF / Silverlight we had the mouse events (MouseLeftButtonDown, MouseMove, MouseLeftButtonUp and friends).
Those didn't directly make it across. WinRT merged the mouse and touch events into pointer events. So now we have:
* UIElement.PointerPressed,
* UIElement.PointerMoved and
* UIElement.PointerReleased (plus PointerEntered, PointerExited and a few more).
The PointerRoutedEventArgs passed to the event handlers not only gives you the current and intermediate positions, it also gives you access to the Pointer associated with the event. Each touch point / mouse button gets a unique PointerId. This way you can track the activity of each touch point on the screen.
This post is getting very verbose and full of gibberish.. let me throw in a code snippet to lighten it up :)
private void LayoutRoot_PointerMoved(object sender, PointerRoutedEventArgs e)
{
    Point position2 = e.GetCurrentPoint(this.LayoutRoot).Position;
    Pointer pointer2 = e.Pointer;
    FreeSketch(position2, pointer2);
}

private void FreeSketch(Point position2, Pointer pointer2)
{
    // Only draw for the pointer that started the stroke
    if (bDraw && pointer1.PointerId == pointer2.PointerId)
    {
        Line l = new Line()
        {
            X1 = position1.X,
            Y1 = position1.Y,
            X2 = position2.X,
            Y2 = position2.Y,
            StrokeThickness = 5,
            Stroke = new SolidColorBrush(Colors.Blue),
            Fill = new SolidColorBrush(Colors.Blue)
        };
        this.LayoutRoot.Children.Add(l); // adding to canvas
        this.position1 = position2;      // advance the stroke's start point
    }
}
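For completeness, here is one way the companion handlers could look — the fields bDraw, pointer1 and position1 are the ones the snippet above assumes, and this wiring is my sketch, not necessarily the original:

```csharp
private bool bDraw;
private Pointer pointer1;
private Point position1;

private void LayoutRoot_PointerPressed(object sender, PointerRoutedEventArgs e)
{
    // Remember which pointer started the stroke and where
    pointer1 = e.Pointer;
    position1 = e.GetCurrentPoint(this.LayoutRoot).Position;
    bDraw = true;
}

private void LayoutRoot_PointerReleased(object sender, PointerRoutedEventArgs e)
{
    // Stop drawing only when the stroke's own pointer lifts -
    // a second finger releasing shouldn't end the first finger's stroke
    if (pointer1 != null && pointer1.PointerId == e.Pointer.PointerId)
        bDraw = false;
}
```

Comparing PointerId in both handlers is what makes this multi-touch safe: each finger gets its own id, so strokes never bleed into each other.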