objective-c - Getting individual pixels from touch points


Is it possible to detect every pixel being touched? More specifically, when the user touches the screen, is it possible to track all the x-y coordinates of the cluster of points being touched? How can I tell the difference between when the user draws with a thumb and when they draw with a fingertip? I would like to vary the brush depending on how the user touches the screen, and I would also like to track all the pixels being touched.

I am currently using the following code from the GLPaint sample on the Apple developer site:

http://developer.apple.com/library/ios/#samplecode/GLPaint/Introduction/Intro.html

The sample code allows drawing with a predefined brush size and tracks the x-y coordinates along the way. How can I change the brush based on how the user touches the screen, and track all the pixels being touched?

// Draws a line onscreen based on where the user touches
- (void)renderLineFromPoint:(CGPoint)start toPoint:(CGPoint)end
{
     NSLog(@"x:%f   y:%f", start.x, start.y);

     static GLfloat *vertexBuffer = NULL;
     static NSUInteger vertexMax = 64;
     NSUInteger vertexCount = 0, count, i;

     [EAGLContext setCurrentContext:context];
     glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);

     // Convert locations from points to pixels
     CGFloat scale = self.contentScaleFactor;
     start.x *= scale;
     start.y *= scale;
     end.x *= scale;
     end.y *= scale;

     // Allocate the vertex array buffer on first use
     if (vertexBuffer == NULL)
          vertexBuffer = malloc(vertexMax * 2 * sizeof(GLfloat));

     // Add points to the buffer so there are drawing points every kBrushPixelStep pixels
     count = MAX(ceilf(sqrtf((end.x - start.x) * (end.x - start.x) +
                             (end.y - start.y) * (end.y - start.y)) / kBrushPixelStep), 1);
     for (i = 0; i < count; ++i) {
          // Grow the buffer if it is full
          if (vertexCount == vertexMax) {
               vertexMax = 2 * vertexMax;
               vertexBuffer = realloc(vertexBuffer, vertexMax * 2 * sizeof(GLfloat));
          }

          // Interpolate linearly between start and end
          vertexBuffer[2 * vertexCount + 0] = start.x + (end.x - start.x) * ((GLfloat)i / (GLfloat)count);
          vertexBuffer[2 * vertexCount + 1] = start.y + (end.y - start.y) * ((GLfloat)i / (GLfloat)count);
          vertexCount += 1;
     }

     // Render the vertex array
     glVertexPointer(2, GL_FLOAT, 0, vertexBuffer);
     glDrawArrays(GL_POINTS, 0, vertexCount);

     // Display the buffer
     glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
     [context presentRenderbuffer:GL_RENDERBUFFER_OES];
}
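
To also record every point a stroke passes through (the "track all the pixels being touched" part of the question), one option is to keep the interpolated vertices as they are generated in the loop above. A minimal sketch, where touchedPoints is a hypothetical NSMutableArray ivar created in the view's initializer:

     // Hypothetical addition inside the interpolation loop of
     // renderLineFromPoint:toPoint:, next to the vertexBuffer writes.
     // Coordinates are in pixels, sampled every kBrushPixelStep pixels.
     CGFloat px = start.x + (end.x - start.x) * ((GLfloat)i / (GLfloat)count);
     CGFloat py = start.y + (end.y - start.y) * ((GLfloat)i / (GLfloat)count);
     [touchedPoints addObject:[NSValue valueWithCGPoint:CGPointMake(px, py)]];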


// Handles the start of a touch
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
     CGRect bounds = [self bounds];
     UITouch *touch = [[event touchesForView:self] anyObject];
     firstTouch = YES;

     // Convert touch point from UIView referential to OpenGL one (upside-down flip)
     location = [touch locationInView:self];
     location.y = bounds.size.height - location.y;
}

// Handles the continuation of a touch.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
     CGRect bounds = [self bounds];
     UITouch *touch = [[event touchesForView:self] anyObject];

     // Convert touch point from UIView referential to OpenGL one (upside-down flip)
     if (firstTouch) {
          firstTouch = NO;
          previousLocation = [touch previousLocationInView:self];
          previousLocation.y = bounds.size.height - previousLocation.y;
     } else {
          location = [touch locationInView:self];
          location.y = bounds.size.height - location.y;
          previousLocation = [touch previousLocationInView:self];
          previousLocation.y = bounds.size.height - previousLocation.y;
     }

     // Render the stroke
     [self renderLineFromPoint:previousLocation toPoint:location];
}



// Handles the end of a touch event when the touch is a tap.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
     CGRect bounds = [self bounds];
     UITouch *touch = [[event touchesForView:self] anyObject];
     if (firstTouch) {
          firstTouch = NO;
          previousLocation = [touch previousLocationInView:self];
          previousLocation.y = bounds.size.height - previousLocation.y;
          [self renderLineFromPoint:previousLocation toPoint:location];
     }
}


// Handles the end of a touch event.
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
     // If appropriate, add code necessary to save the state of the application.
     // This application is not saving state.
}



Best Answer


AFAIK there is no API for accessing the touched area of a touch. Given the limitations of capacitive touch screens, I'm not even sure whether what you want is physically possible. I remember a recent presentation at Cocoa Heads where it was demonstrated that some of this information is available for trackpads on OS X (via private APIs), but not on iOS.

I believe this is one of the reasons why graphics tablets use special styluses with built-in sensor technology.
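
One caveat: since iOS 8 (which postdates this answer), UITouch does expose majorRadius and majorRadiusTolerance, a coarse estimate of the contact size in points. That is still nowhere near per-pixel contact data, but it can help separate a broad thumb press from a fingertip. A minimal sketch, assuming an iOS 8+ deployment target and a guessed threshold:

     // Inside touchesMoved:withEvent: (iOS 8+): rough thumb-vs-fingertip check.
     UITouch *touch = [[event touchesForView:self] anyObject];
     CGFloat radius    = touch.majorRadius;          // estimated contact radius, in points
     CGFloat tolerance = touch.majorRadiusTolerance; // uncertainty of that estimate
     // The 25 pt threshold is an assumption and would need tuning on real devices.
     BOOL looksLikeThumb = (radius - tolerance) > 25.0;
     NSLog(@"contact radius: %.1f +/- %.1f pt, thumb-like: %d", radius, tolerance, looksLikeThumb);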

For a drawing app, a partial workaround might be to simulate "inking" the way some desktop applications do: if the user's touch lingers at a given spot, draw as if ink were flowing out of the "pen" and gradually spreading through the "paper".
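
A hypothetical sketch of that ink-bleed idea on top of the GLPaint code above: while the touch dwells in place, a timer enlarges the brush and restamps the current location. inkTimer and brushSize are invented ivars, and the interval and cap are arbitrary:

     // Start (or restart) the bleed timer when a touch begins or stops moving.
     - (void)startInkBleed
     {
          [inkTimer invalidate];
          inkTimer = [NSTimer scheduledTimerWithTimeInterval:0.05  // bleed step every 50 ms
                                                      target:self
                                                    selector:@selector(bleedInk:)
                                                    userInfo:nil
                                                     repeats:YES];
     }

     // Grow the brush toward a cap and stamp a dot at the dwell location,
     // as if ink were spreading through the paper.
     - (void)bleedInk:(NSTimer *)timer
     {
          brushSize = MIN(brushSize + 0.5f, 64.0f);
          glPointSize(brushSize * self.contentScaleFactor);
          [self renderLineFromPoint:location toPoint:location];
     }

touchesBegan: would reset brushSize and call startInkBleed, touchesMoved: would reset them again (the ink only spreads while the pen is stationary), and touchesEnded:/touchesCancelled: would invalidate the timer.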

Regarding objective-c - Getting individual pixels from touch points, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/10015774/
