How to use Multi-touch in Android 2

This is the first in a series of articles on developing multi-touch applications with Android 2.x. It is excerpted from Chapter 11 of the book "Hello, Android! (3rd edition)", available in beta now at The Pragmatic Programmers.

Introducing multi-touch

Multi-touch is simply an extension of the regular touch-screen user interface, using two or more fingers instead of one. We've used single-finger gestures before, although we didn't call them that. In Chapter 4 we let the user touch a tile in the Sudoku game in order to change it. That's called a "tap" gesture. Another gesture is called "drag": you hold one finger on the screen and move it around, causing the content under your finger to scroll.

Tap, drag, and a few other single-fingered gestures have always been supported in Android. But due to the popularity of the Apple iPhone, early Android users suffered from a kind of gesture envy. The iPhone supported multi-touch, in particular the "pinch zoom" gesture.

[Figure: Three common touch gestures: a) tap, b) drag, and c) pinch zoom. (Image courtesy of GestureW)]

With pinch zoom, you place two fingers on the screen and squeeze them together to make the item you're viewing smaller, or pull them apart to make it bigger. Before Android 2.0 you had to use a clunky zoom control with icons that you pressed to zoom in and out (for example, the setBuiltInZoomControls() method in the MyMap example). But thanks to its new multi-touch support, you can now pinch to zoom on Android too, as long as the application supports it.

Note: If you try to run the example in this chapter on Android 1.5 or 1.6, it will crash because those versions do not support multi-touch. We'll learn how to work around that in Chapter 13, "Write Once, Test Everywhere".

Warning: Multi-bugs ahead

Multi-touch, as implemented on current Android phones, is extremely buggy. In fact it's so buggy that it borders on the unusable. The API routinely reports invalid or impossible data points, especially during the transition from one finger to two fingers on the screen and vice versa. On the developer forums you can find complaints of fingers getting swapped, x and y axes flipping, and multiple fingers sometimes being treated as one. With a lot of trial and error, I was able to get the example in this chapter working because the gesture it implements is so simple. Until Google acknowledges and fixes the problems, that may be about all you can do. Luckily, pinch zoom seems to be the only multi-touch gesture most people want.

The Touch example

To demonstrate multi-touch, we're going to build a simple image viewer application that lets you zoom in and scroll around an image. Here's a screenshot of the finished product:

[Figure: The Touch example implements a simple image viewer with drag and pinch zoom.]

Building the Touch example

To demonstrate multi-touch, we're going to build a simple image viewer application that lets you zoom in and scroll around an image. See Part 1 for a screenshot of the finished product. Begin by creating a new "Hello, Android" project with the following parameters in the New Android Project dialog box:

Project name: Touch
Build Target: Android 2.1
Application name: Touch
Package name: org.example.touch
Create Activity: Touch

This will create Touch.java to contain your main activity.
Let's edit it to show a sample image, put in a touch listener, and add a few imports we'll need later.

From Touchv1/src/org/example/touch/Touch.java:

    package org.example.touch;

    import android.app.Activity;
    import android.graphics.Matrix;
    import android.graphics.PointF;
    import android.os.Bundle;
    import android.util.FloatMath;
    import android.util.Log;
    import android.view.MotionEvent;
    import android.view.View;
    import android.view.View.OnTouchListener;
    import android.widget.ImageView;

    public class Touch extends Activity implements OnTouchListener {
        private static final String TAG = "Touch";

        @Override
        public void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.main);
            ImageView view = (ImageView) findViewById(R.id.imageView);
            view.setOnTouchListener(this);
        }

        @Override
        public boolean onTouch(View v, MotionEvent event) {
            // Handle touch events here...
            return true;
        }
    }

We'll fill out that onTouch() method in a moment. First we need to define the layout for our activity.

From Touchv1/res/layout/main.xml:

    <?xml version="1.0" encoding="utf-8"?>
    <FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent" >
        <ImageView
            android:id="@+id/imageView"
            android:layout_width="fill_parent"
            android:layout_height="fill_parent"
            android:src="@drawable/butterfly"
            android:scaleType="matrix" >
        </ImageView>
    </FrameLayout>

The entire interface is a big ImageView control that covers the whole screen. The android:src="@drawable/butterfly" value refers to the butterfly image used in the example. You can use any JPG or PNG image you like; just put it in the res/drawable-nodpi directory. The android:scaleType="matrix" attribute indicates we're going to use a matrix to control the position and scale of the image. More on that later.
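As an aside (this is not shown in the book's listings), the same scale type could also be set from code rather than XML. A minimal sketch of what you could put in onCreate() instead:

    // Equivalent (in code) to android:scaleType="matrix" in the layout:
    // tell the ImageView to position its drawable using an explicit Matrix.
    ImageView view = (ImageView) findViewById(R.id.imageView);
    view.setScaleType(ImageView.ScaleType.MATRIX);

Declaring it in the layout keeps the Java code shorter, which is why the example does it that way.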
The AndroidManifest.xml file is untouched except for the addition of the android:theme attribute.

From Touchv1/AndroidManifest.xml:

    <manifest xmlns:android="http://schemas.android.com/apk/res/android"
        package="org.example.touch"
        android:versionCode="1"
        android:versionName="1.0" >
        <application
            android:icon="@drawable/icon"
            android:label="@string/app_name"
            android:theme="@android:style/Theme.NoTitleBar.Fullscreen" >
            <activity
                android:name=".Touch"
                android:label="@string/app_name" >
                <intent-filter>
                    <action android:name="android.intent.action.MAIN" />
                    <category android:name="android.intent.category.LAUNCHER" />
                </intent-filter>
            </activity>
        </application>
        <uses-sdk android:minSdkVersion="3" android:targetSdkVersion="7" />
    </manifest>

@android:style/Theme.NoTitleBar.Fullscreen, as the name suggests, tells Android to use the entire screen with no title bar or status bar at the top. You can run the application now and it will simply display the picture.

Understanding touch events

Whenever I first learn a new API, I like to put in some code to dump everything out so I can get a feel for what the methods do and in what order events happen. So let's start with that. First add a call to the dumpEvent() method inside onTouch().

From Touchv1/src/org/example/touch/Touch.java:

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        // Dump touch event to log
        dumpEvent(event);
        return true; // indicate event was handled
    }

Note that we need to return true to indicate to Android that the event has been handled. Next, define the dumpEvent() method. The only parameter is the event that we want to dump.

From Touchv1/src/org/example/touch/Touch.java:

    /** Show an event in the LogCat view, for debugging */
    private void dumpEvent(MotionEvent event) {
        String names[] = { "DOWN", "UP", "MOVE", "CANCEL", "OUTSIDE",
                "POINTER_DOWN", "POINTER_UP", "7?", "8?", "9?" };
        StringBuilder sb = new StringBuilder();
        int action = event.getAction();
        int actionCode = action & MotionEvent.ACTION_MASK;
        sb.append("event ACTION_").append(names[actionCode]);
        if (actionCode == MotionEvent.ACTION_POINTER_DOWN
                || actionCode == MotionEvent.ACTION_POINTER_UP) {
            sb.append("(pid ").append(
                    action >> MotionEvent.ACTION_POINTER_ID_SHIFT);
            sb.append(")");
        }
        sb.append("[");
        for (int i = 0; i < event.getPointerCount(); i++) {
            sb.append("#").append(i);
            sb.append("(pid ").append(event.getPointerId(i));
            sb.append(")=").append((int) event.getX(i));
            sb.append(",").append((int) event.getY(i));
            if (i + 1 < event.getPointerCount())
                sb.append(";");
        }
        sb.append("]");
        Log.d(TAG, sb.toString());
    }

Output will go to the Android debug log, which you can see by opening the LogCat view (see Section 3.10, Debugging with Log Messages). The easiest way to understand this code is to run it. Unfortunately the Emulator isn't much help here (you can run the program there, but the Emulator doesn't support multi-touch, so the results won't be very interesting). So hook up a real phone to your USB port and run the sample there (see Section 1.4, Running on a Real Phone). When I tried it on my phone and performed a few quick gestures, I received the output below:

 1. event ACTION_DOWN[#0(pid 0)=135,179]
 2. event ACTION_MOVE[#0(pid 0)=135,184]
 3. event ACTION_MOVE[#0(pid 0)=144,205]
 4. event ACTION_MOVE[#0(pid 0)=152,227]
 5. event ACTION_POINTER_DOWN(pid 1)[#0(pid 0)=153,230;#1(pid 1)=380,538]
 6. event ACTION_MOVE[#0(pid 0)=153,231;#1(pid 1)=380,538]
 7. event ACTION_MOVE[#0(pid 0)=155,236;#1(pid 1)=364,512]
 8. event ACTION_MOVE[#0(pid 0)=157,240;#1(pid 1)=350,498]
 9. event ACTION_MOVE[#0(pid 0)=158,245;#1(pid 1)=343,494]
10. event ACTION_POINTER_UP(pid 0)[#0(pid 0)=158,247;#1(pid 1)=336,484]
11. event ACTION_MOVE[#0(pid 1)=334,481]
12. event ACTION_MOVE[#0(pid 1)=328,472]
13. event ACTION_UP[#0(pid 1)=327,471]

Here's how to interpret the events:

· On line 1 we see an ACTION_DOWN event, so the user must have pressed one finger on the screen. The finger was positioned at coordinates x=135, y=179, which is near the upper left of the display. You can't tell yet whether they're trying to do a tap or a drag.

· Next, starting on line 2 there are some ACTION_MOVE events, indicating the user moved their finger a bit, to the coordinates given in the events. (It's actually very hard to put your finger on the screen and not move it at all, so you'll get a lot of these.) By the amount moved you can tell the user is doing a drag gesture.

· The next event, ACTION_POINTER_DOWN on line 5, means the user pressed down another finger. "pid 1" means that pointer id 1 (that is, finger number 1) was pressed. Finger number 0 was already down, so we now have two fingers being tracked on the screen. In theory, the Android API can support up to 256 fingers at once, but the first crop of Android 2.x phones is limited to 2. The coordinates for both fingers come back as part of the event. It looks like the user is about to start a pinch zoom gesture.

· Here's where it gets interesting. The next thing we see is a series of ACTION_MOVE events starting on line 6. Unlike before, now we have two fingers moving around. If you look closely at the coordinates you can see the fingers are moving closer together as part of a pinch zoom.

· Then on line 10 we see an ACTION_POINTER_UP on pid 0. This means that finger number 0 was lifted off the screen. Finger number 1 is still there. Naturally, this ends the pinch zoom gesture.

· We see a couple more ACTION_MOVE events starting on line 11, indicating the remaining finger is still moving around a little. If you compare these to the earlier move events, you'll notice a different pointer id is reported. Unfortunately the touch API is so buggy that you can't always count on that.

· Finally, on line 13 we get an ACTION_UP event as the last finger is removed from the screen.

Now the code for dumpEvent() should make a little more sense. The getAction() method returns the action being performed (up, down, or move). The lowest 8 bits of the action value are the action code itself, and the next 8 bits are the pointer (finger) id, so we have to use a bitwise AND (&) and a right shift (>>) to separate them, as in the small worked example below.
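For instance (a hand-worked illustration; the raw value here is made up, not taken from the log above), an action value of 0x0105 decodes like this:

    // Illustrative decoding of a raw action value.
    int action = 0x0105;
    int actionCode = action & MotionEvent.ACTION_MASK;       // 0x05 = ACTION_POINTER_DOWN
    int pid = action >> MotionEvent.ACTION_POINTER_ID_SHIFT; // 1 = pointer id 1, the second finger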
Then we call the getPointerCount() method to see how many finger positions are included. getX() and getY() return the X and Y coordinates, respectively. The fingers can appear in any order, so we have to call getPointerId() to find out which finger we're really talking about. That covers the raw touch event data. The trick, as you might imagine, is in interpreting and acting on that data.

Setting up for Image Transformation

In order to move and zoom the image we'll use a neat little feature of the ImageView class called matrix transformation. Using a matrix we can represent any translation, rotation, or skew that we want to apply to the image. We already turned it on by specifying android:scaleType="matrix" in the res/layout/main.xml file. In the Touch class, we need to declare two matrices as fields (one for the current value and one for the original value before the transformation). We'll use them in the onTouch() method to transform the image. We also need a mode variable to tell whether we're in the middle of a drag or zoom gesture, plus a point to remember where a drag started (used by the drag code below).

From Touchv1/src/org/example/touch/Touch.java:

    public class Touch extends Activity implements OnTouchListener {
        // These matrices will be used to move and zoom the image
        Matrix matrix = new Matrix();
        Matrix savedMatrix = new Matrix();

        // Remember where a drag gesture started
        PointF start = new PointF();

        // We can be in one of these 3 states
        static final int NONE = 0;
        static final int DRAG = 1;
        static final int ZOOM = 2;
        int mode = NONE;

        @Override
        public boolean onTouch(View v, MotionEvent event) {
            ImageView view = (ImageView) v;

            // Dump touch event to log
            dumpEvent(event);

            // Handle touch events here...
            switch (event.getAction() & MotionEvent.ACTION_MASK) {
            }

            // Perform the transformation
            view.setImageMatrix(matrix);

            return true; // indicate event was handled
        }
    }

The matrix variable will be calculated inside the switch statement when we implement the gestures.
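To get a feel for what a matrix transformation does before we wire it up to gestures, here is a small standalone sketch (the values are purely illustrative and this is not part of the example's code path; view is assumed to be the ImageView from onTouch() above):

    // Build up a transformation by hand: shift the image 100 pixels right and
    // 50 pixels down, then scale it to twice its size around the point (0, 0).
    Matrix m = new Matrix();
    m.postTranslate(100f, 50f);
    m.postScale(2f, 2f, 0f, 0f);
    // Because the layout declared android:scaleType="matrix", the ImageView
    // simply draws its image through whatever matrix we hand it.
    view.setImageMatrix(m);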
Implementing the Drag Gesture

A drag gesture starts when the first finger is pressed to the screen (ACTION_DOWN) and ends when it is removed (ACTION_UP or ACTION_POINTER_UP).

From Touchv1/src/org/example/touch/Touch.java:

    switch (event.getAction() & MotionEvent.ACTION_MASK) {
    case MotionEvent.ACTION_DOWN:
        savedMatrix.set(matrix);
        start.set(event.getX(), event.getY());
        Log.d(TAG, "mode=DRAG");
        mode = DRAG;
        break;
    case MotionEvent.ACTION_UP:
    case MotionEvent.ACTION_POINTER_UP:
        mode = NONE;
        Log.d(TAG, "mode=NONE");
        break;
    case MotionEvent.ACTION_MOVE:
        if (mode == DRAG) {
            matrix.set(savedMatrix);
            matrix.postTranslate(event.getX() - start.x,
                    event.getY() - start.y);
        }
        break;
    }

When the gesture starts we remember the current value of the transformation matrix and the starting position of the pointer. Every time the finger moves, we start the transformation matrix over at its original value and call the postTranslate() method to add a translation vector: the difference between the current and starting positions.
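The excerpt ends here, before the zoom half of the gesture is implemented. Purely as a hedged sketch of where this is heading (the spacing() and midPoint() helpers, the mid and oldDist fields, and the 10-pixel threshold below are my assumptions, not taken from the excerpt), the pinch-zoom cases might look roughly like this:

    // Hypothetical continuation of the switch in onTouch(); assumes two extra
    // fields on the class:  PointF mid = new PointF();  float oldDist = 1f;
    case MotionEvent.ACTION_POINTER_DOWN:
        oldDist = spacing(event);          // distance between the two fingers
        if (oldDist > 10f) {               // ignore nearly coincident touches
            savedMatrix.set(matrix);
            midPoint(mid, event);          // zoom around the fingers' midpoint
            mode = ZOOM;
        }
        break;
    case MotionEvent.ACTION_MOVE:
        if (mode == DRAG) {
            matrix.set(savedMatrix);
            matrix.postTranslate(event.getX() - start.x, event.getY() - start.y);
        } else if (mode == ZOOM) {
            float newDist = spacing(event);
            if (newDist > 10f) {
                matrix.set(savedMatrix);
                float scale = newDist / oldDist;   // > 1 means the fingers moved apart
                matrix.postScale(scale, scale, mid.x, mid.y);
            }
        }
        break;

The two assumed helpers would compute the distance between, and the midpoint of, the first two pointers:

    /** Distance between the first two fingers (assumed helper, not from the excerpt). */
    private float spacing(MotionEvent event) {
        float x = event.getX(0) - event.getX(1);
        float y = event.getY(0) - event.getY(1);
        return FloatMath.sqrt(x * x + y * y);
    }

    /** Midpoint of the first two fingers (assumed helper, not from the excerpt). */
    private void midPoint(PointF point, MotionEvent event) {
        float x = event.getX(0) + event.getX(1);
        float y = event.getY(0) + event.getY(1);
        point.set(x / 2, y / 2);
    }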