Design of an Articulated Manipulator [Four Degrees of Freedom, Cylindrical-Coordinate, Hydraulically Driven]
Graduation Design (Thesis)

Design of an Articulated Manipulator

Student Name: Zhang Anhuai
Student ID: 0515011127
Department: Department of Mechanical Engineering
Class: Mechanical Manufacturing Class 1, 2005
Supervisor: Xie Weirong
Date: June 2009
Joint-based Robot Design
By
Zhang Anhuai
June 2009
Graduation Design (Thesis)
English Literature Translation

Student Name: Zhang Anhuai
Student ID: 0515011127
Department: Department of Mechanical Engineering
Class: Mechanical Manufacturing Class 1, 2005
Supervisor: Xie Weirong
Date: June 2009
Graduation Design (Thesis) Task Book

Department: Department of Mechanical Engineering    Supervisor: Xie Weirong    Title:
Student Name: Zhang Anhuai    Class: Mechanical Manufacturing Class 1, 2005    Student ID: 0515011127
Design Topic: Design of an Articulated Manipulator

Design content, objectives, and requirements (including the design schedule):

Content:
1. Overall structural design of the articulated manipulator
2. Design of the drive system for each joint of the manipulator
3. Design of the manipulator control system
4. Drawing of the assembly drawing, the part drawings, and the schematic diagrams of the drive and control systems

Requirements:
1. Collect references (at least 5 relevant books and 10 articles);
2. Produce the part drawings, assembly drawing, and schematic diagrams;
3. Write the thesis to the required standard;
4. Translate one English article of about 3,000 words.

Schedule:
1. Weeks 1-6: preparation for the graduation design — study the topic, collect references, complete the graduation internship, clarify the research objectives and tasks, and draft the overall scheme;
2. Weeks 7-10: produce the relevant drawings;
3. Weeks 11-13: finish and refine the work, write the thesis, and prepare for the defense.

Supervisor signature: Xie Weirong
Date:
Department review:
This form is completed by the supervisor and reviewed by the department.
Graduation Design (Thesis) Opening Report

Topic: Design of an Articulated Manipulator
Topic source: Production practice
Topic type: AX
Supervisor: Xie Weirong
Student Name: Zhang Anhuai    Student ID: 0515011127    Class: Mechanical Manufacturing Class 1, 2005

Research status, purpose, and significance of this topic

Research status:
The manipulator is a high-tech automated production device developed over the past few decades. Its defining feature is that it can be programmed to carry out a variety of prescribed tasks; in construction and performance it combines the respective advantages of humans and machines, in particular human intelligence and adaptability. The accuracy of manipulator operation and its ability to complete tasks in many environments give it broad development prospects across all sectors of the national economy.

The Chinese national standard GB/T 12643-90 defines a manipulator as "a mechanical device with motion functions similar to a human arm that can grasp and release objects in space or perform other operations." Manipulators fall into two broad categories: special-purpose and general-purpose. A special-purpose manipulator is an attachment to a host machine; its motions are simple, it handles a single type of workpiece, it runs a fixed (sometimes adjustable) program, and it is used in high-volume automated production — for example, loading manipulators on automatic production lines, automatic tool-changing manipulators, and assembly and welding manipulators. A general-purpose manipulator has its own independent control system, variable programs, and flexible, varied motions. It suits small- and medium-batch automated production with changing product types; it has a large working range, high positioning accuracy, and strong versatility, and it is widely used in flexible automatic lines.

Manipulators were first applied in the automotive industry, typically for welding, painting, loading/unloading, and handling. A manipulator extends and amplifies the functions of human limbs and brain: it can replace people working in dangerous, harmful, toxic, low-temperature, or high-temperature environments, and it can take over heavy, monotonous, repetitive labor, raising productivity and ensuring product quality. Manipulators are currently used mainly in manufacturing, especially electrical-appliance manufacturing, automobile manufacturing, plastics processing, general machinery manufacturing, and metalworking. Combined with CNC machining centers, automated transport vehicles, and automatic inspection systems, manipulators can form flexible manufacturing systems (FMS) and computer-integrated manufacturing systems (CIMS) to automate production. As production develops and their functions and performance continue to improve, the fields of application of manipulators keep expanding.

Research purpose and significance:
Industrial manipulators have many advantages that humans cannot match and meet the needs of large-scale socialized production. Their main advantages are as follows:
1. They can replace people in dangerous or harmful operations. Given a design suited to the working environment and appropriate materials and structure, a manipulator can work under abnormally high or low temperature, abnormal pressure, harmful gases, dust, or radiation, and in hazardous operations such as stamping and fire fighting. Industrial manipulators or robots should be promoted in trades with high injury rates, such as stamping, die casting, heat treatment, forging, painting, and arc welding with strong ultraviolet radiation.
2. They can work for long periods without fatigue, freeing people from heavy, monotonous labor and extending human capabilities. A person tires or grows bored after a few hours of continuous work, whereas a manipulator, given proper maintenance and inspection, can handle long stretches of monotonous, repetitive work.
3. Their motions are accurate, which stabilizes and improves product quality while avoiding human operating errors.
4. Manipulators, especially general-purpose industrial manipulators, are versatile and flexible, and adapt well to continually changing product types, meeting the needs of flexible production.
5. Manipulators markedly raise labor productivity and reduce costs.
Because manipulators play such a large role in industrial automation and informatization, countries around the world attach great importance to their application and development. In China the application of manipulators is still at an early stage, yet it has already shown many irreplaceable advantages and broad application prospects. Over the past decade or so, manipulator development has become increasingly refined and has spread across many fields, with a very wide range of applications.
An articulated manipulator occupies little space, has a large working range, low inertia, and low power demand, can grasp objects near its base, and can choose paths around obstacles, so research on articulated manipulators is of great significance.

Topic type key:
(1) A: engineering practice; B: theoretical research; C: research-apparatus development; D: computer software; E: comprehensive application.
(2) X: real topic; Y: simulated topic.
Both (1) and (2) must be filled in, e.g. AY or BX.
Research content of this topic
1) Draw up the overall scheme, in particular a design that organically integrates the sensors, the control method, and the mechanical body.
2) Select suitable structures for the hand, wrist, and body according to the given degrees of freedom and technical parameters.
3) Carry out the design calculations for each component.
4) Design and draw the working assembly drawing of the industrial manipulator.
5) Design and draw the hydraulic system diagram.
6) Draw the electrical control diagram.
7) Write the design calculation report.

Implementation plan and schedule
Implementation plan:
Collect references, draw up the overall structural design, and select suitable structures for the hand, wrist, and body according to the given degrees of freedom and technical parameters.
Schedule:
1) Weeks 1-6: preparation for the graduation design — study the topic, collect references, complete the graduation internship, clarify the research objectives and tasks, and draft the overall scheme.
2) Weeks 7-10: produce the relevant drawings.
3) Weeks 11-13: finish and refine the work, write the thesis, and prepare for the defense.
Main references consulted
1. Monographs
[1] Li Yunwen. Industrial Manipulator Design [M]. Beijing: China Machine Press, 1996.
[2] Kato Ichiro. Manipulator Atlas [M]. Shanghai: Shanghai Scientific & Technical Publishers, 1979.
[3] Electromechanical Research Institute, Machinery Research Academy, First Ministry of Machine Building (ed.). Industrial Manipulator Atlas. Beijing: Electromechanical Research Institute, Machinery Research Academy, First Ministry of Machine Building, 1976.
2. Theses
[1] Fu Yazi. Manipulator Control System [C]. Hubei: Hubei University of Technology, 2006.
[2] Design of a General-Purpose Manipulator [C]. Hubei: Hubei University of Technology, 2006.

Supervisor's comments
Supervisor signature:
Date:
Graduation Design (Thesis) Defense Application Form

Topic: Design of an Articulated Manipulator
Supervisor (title): Xie Weirong
Reason for application: The design has been completed as required by the supervisor; I hereby apply for the defense.
Department: Department of Mechanical Engineering    Class: Mechanical Manufacturing Class 1, 2005    Student ID: 0515011127
Student signature:    Date:
Graduation Design (Thesis) Supervisor Review Form

No. | Item (science/engineering, management) | Item (humanities) | Max score | Score
1 | Workload | Foreign-language translation | 15 |
2 | Literature reading and foreign-language translation | Literature reading and literature review | 10 |
3 | Technical level and practical ability | Innovation and academic level | 25 |
4 | Research results, basic theory, and professional knowledge | Argumentation ability | 25 |
5 | Written expression | Written expression | 10 |
6 | Study attitude and compliance with standards | Study attitude and compliance with standards | 15 |
Total | | | 100 |

Comments (state whether the student may proceed to the defense):

Supervisor signature:    Date:
The Graduation Design (Thesis) Supervision Record Book is attached separately.
Graduation Design (Thesis) Reviewer Review Form

Student Name: Zhang Anhuai    Class: Mechanical Manufacturing Class 1, 2005    Student ID: 0515011127
Design (Thesis) Title: Design of an Articulated Manipulator
Reviewer:    Reviewer's title:

No. | Item (science/engineering, management) | Item (humanities) | Max score | Score
1 | Workload | Foreign-language translation | 15 |
2 | Literature reading and foreign-language translation | Literature reading and literature review | 10 |
3 | Technical level and practical ability | Innovation and academic level | 25 |
4 | Research results, basic theory, and professional knowledge | Argumentation ability | 25 |
5 | Written expression | Written expression | 10 |
6 | Study attitude and compliance with standards | Study attitude and compliance with standards | 15 |
Total | | | 100 |

Comments:

Reviewer signature:    Date:
Graduation Design (Thesis) Defense Form

Student Name: Zhang Anhuai    Class: Mechanical Manufacturing Class 1, 2005    Student ID: 0515011127
Design (Thesis) Title: Design of an Articulated Manipulator

No. | Item | Criteria | Max score | Score
1 | Content of the presentation | Clear line of thought; accurate language, clear concepts, and sound arguments; scientific experimental methods and reasonable analysis; conclusions of practical value. | 40 |
2 | Delivery of the presentation | Thorough preparation; time within the limit. | 10 |
3 | Innovation | Improvement on or breakthrough beyond prior work, or original insight. | 10 |
4 | Question and answer | Answers grounded in theory, with clear basic concepts; main questions answered accurately and in depth. | 40 |
Total | | | 100 |

Defense panel comments:

Defense panel chair (signature):    Date:

Defense committee opinion:

Defense committee head (signature):    Date:
Graduation Design (Thesis) Defense Record Form

Student Name: Zhang Anhuai    Class: Mechanical Manufacturing Class 1, 2005    Student ID: 0515011127
Design (Thesis) Title: Design of an Articulated Manipulator
Defense time:    Defense location:
Defense committee members:

Question 1
Asked by:
Question:
Answer (key points):

Question 2
Asked by:
Question:
Answer (key points):

Question 3
Asked by:
Question:
Answer (key points):

Recorder signature:
(Attach additional pages if needed)
Question 4
Asked by:
Question:
Answer (key points):

Question 5
Asked by:
Question:
Answer (key points):

Question 6
Asked by:
Question:
Answer (key points):

Question 7
Asked by:
Question:
Answer (key points):

Question 8
Asked by:
Question:
Answer (key points):

Recorder signature:
Graduation Design (Thesis) Overall Grade Sheet

Student Name: Zhang Anhuai    Class: Mechanical Manufacturing Class 1, 2005
Graduation Design (Thesis) Title: Design of an Articulated Manipulator

Grade category | Grade
I. Supervisor's grade |
II. Reviewer's grade |
III. Defense panel's grade |
Overall grade: I × 40% + II × 20% + III × 40% |
Final rating |

Note: grades are given separately by the supervisor, the reviewer, and the defense panel on a 100-point scale; the final rating is then assigned as Excellent (90-100), Good (80-89), Fair (70-79), Pass (60-69), or Fail (below 60). The supervisor's grade counts for 40%, the reviewer's for 20%, and the defense panel's for 40%.
Abstract

The articulated manipulator designed in this thesis uses a cylindrical coordinate layout and can perform loading, turnover, and related tasks. The manipulator consists mainly of a gripper, a wrist, an arm, and a body, and has four degrees of freedom — wrist rotation, arm extension, arm lift, and arm rotation — which satisfy general industrial requirements.

The manipulator is positioned by potentiometers and uses point-to-point control; the control system is a programmable logic controller (PLC), giving good versatility and flexibility.

The manipulator is hydraulically driven: the four degrees of freedom and the gripper clamping are all actuated by hydraulic cylinders. The oil circuit was laid out and planned with manufacturing feasibility constantly in mind, and it is arranged as a spatial structure, making the manipulator simpler and more compact.

Keywords: articulated manipulator; cylindrical coordinates; hydraulic cylinder; PLC
Contents

Abstract (Chinese) ………………………………………………………… i
Abstract (English) ………………………………………………………… ii
1 Introduction ……………………………………………………………… 1
 1.1 Research purpose and significance ……………………………… 1
 1.2 Research content of this topic …………………………………… 2
2 Overall Design of the Manipulator ………………………………… 3
 2.1 Composition of an industrial manipulator ……………………… 3
  2.1.1 Actuating mechanism ………………………………………… 3
  2.1.2 Drive mechanism ……………………………………………… 4
  2.1.3 Control system ………………………………………………… 4
 2.2 Main technical parameters of the articulated manipulator …… 4
 2.3 Kinematic diagram of the cylindrical-coordinate manipulator … 5
3 Mechanical System Design of the Articulated Manipulator ……… 6
 3.1 Hand ………………………………………………………………… 6
  3.1.1 Calculation of the clamping force ………………………… 6
  3.1.2 Calculation of the clamping-cylinder driving force ……… 7
  3.1.3 Gripping-error analysis and calculation for two-pivot rotary fingers … 8
  3.1.4 Calculation of the clamping cylinder ……………………… 10
 3.2 Wrist ……………………………………………………………… 11
  3.2.1 Basic requirements for wrist design ……………………… 11
  3.2.2 Calculation of the wrist rotation torque ………………… 12
  3.2.3 Design calculation of the wrist rotary cylinder ………… 14
 3.3 Arm ………………………………………………………………… 15
  3.3.1 Arm extension hydraulic cylinder ………………………… 15
  3.3.2 Arm rotation hydraulic cylinder …………………………… 23
4 Hydraulic Drive System of the Manipulator ……………………… 27
 4.1 Hydraulic system of the program-controlled manipulator …… 27
 4.2 Hydraulic system ………………………………………………… 27
  4.2.1 Pressure-switching circuits of the hydraulic cylinders … 27
  4.2.2 Speed-regulation scheme …………………………………… 28
  4.2.3 Deceleration buffer circuit ………………………………… 29
 4.3 Synthesis of the hydraulic system ……………………………… 29
5 Programmable Control of the Manipulator ………………………… 31
 5.1 Allocation of input/output contacts …………………………… 31
  5.1.1 Allocation of travel switches ……………………………… 31
  5.1.2 Allocation of manual buttons ……………………………… 31
  5.1.3 Allocation of input/output relays ………………………… 32
 5.2 External wiring diagram ………………………………………… 32
 5.3 Control panel design ……………………………………………… 33
 5.4 State control diagram …………………………………………… 34
 5.5 Ladder diagram …………………………………………………… 35
Conclusion ………………………………………………………………… 37
Acknowledgements ………………………………………………………… 38
References ………………………………………………………………… 39
Extending Blender: Development of a Haptic Authoring Tool
Abstract - In this paper, we present our work to extend a well-known 3D graphic modeler - Blender - to support haptic modeling and rendering. The extension tool is named HAMLAT (Haptic Application Markup Language Authoring Tool). We describe the modifications and additions to the Blender source code which have been used to create HAMLAT. Furthermore, we present and discuss the design decisions made when developing HAMLAT, as well as an implementation "road map" which describes the changes to the Blender source code. Finally, we conclude with a discussion of our future development and research avenues.
Keywords - Haptics, HAML, Graphic Modelers, Blender, Virtual Environments.
I. INTRODUCTION
A. Motivation
The increasing adoption of haptic modality in human-computer interaction paradigms has led to a huge demand for new tools that help novice users to author and edit haptic applications. Currently, the haptic application development process is a time consuming experience that requires programming expertise. The complexity of haptic applications development rises from the fact that the haptic application components (such as the haptic API, the device, the haptic rendering algorithms, etc.) need to interact with the graphic components in order to achieve synchronicity.
Additionally, there is a lack of application portability as the application is tightly coupled to a specific device that necessitates the use of its corresponding API. Therefore, device and API heterogeneity lead to the fragmentation and disorientation of both researchers and developers. In view of all these considerations, there is a clear need for an authoring tool that can build haptic applications while hiding programming details from the application modeler (such as API, device, or virtual model).
This paper describes the technical development of the Haptic Application Markup Language Authoring Tool (HAMLAT). It is intended to explain the design decisions used for developing HAMLAT and also provides an implementation "road map", describing the source code of the project.
B. Blender
HAMLAT is based on the Blender [1] software suite, which is an open-source 3D modeling package with a rich feature set. It has a sophisticated user interface noted for its efficiency and flexibility, as well as support for multiple file formats, a physics engine, modern computer-graphics rendering, and many other features.
Because of Blender's open architecture and supportive community base, it was selected as the platform of choice for development of HAMLAT. The open-source nature of Blender means HAMLAT can easily leverage its existing functionality and focus on integrating haptic features which make it a complete hapto-visual modeling tool, since developing a 3D modeling platform from scratch requires considerable development time and expertise in order to reach the level of functionality of Blender. Also, we can take advantage of future improvements to Blender by merging changes from its source code into the HAMLAT source tree.
HAMLAT builds on existing Blender components, such as the user interface and editing tools, by adding new components which focus on the representation, modification, and rendering of the haptic properties of objects in a 3D scene. By using Blender as the basis for HAMLAT, we hope to develop a 3D haptic modeling tool which has the maturity and features of Blender combined with the novelty of haptic rendering.
At the time of writing, HAMLAT is based on Blender version 2.43 source code.
C. Project Goals
As previously stated, the overall goal for the HAMLAT project is to produce a polished software application which combines the features of a modern graphic modeling tool with haptic rendering techniques. HAMLAT has the "look and feel" of a 3D graphical modeling package, but with the addition of features such as haptic rendering and haptic property descriptors. This allows artists, modelers, and developers to generate realistic 3D hapto-visual virtual environments.
A high-level block diagram of HAMLAT is shown in Figure 1. It illustrates the flow of data in the haptic modeling. HAMLAT assists the modeler, or application developer, in building hapto-visual applications which may be stored in a database for later retrieval by another haptic application. By hapto-visual application we refer to any software which displays a 3D scene both visually and haptically to a user in a virtual setting. An XML file format, called HAML [2], is used to describe the 3D scenes and store the hapto-visual environments built by a modeler for later playback to an end user.
Traditionally, building hapto-visual environments has required a strong technical and programming background. The task of haptically rendering a 3D scene is tedious, since haptic properties must be assigned to individual objects in the scene, and currently there are few high-level tools for accomplishing this task. HAMLAT bridges this gap by integrating into the HAML framework and delivering a complete solution for the development of hapto-visual applications requiring no programming knowledge.
The remainder of the paper is organized as follows: in Section 2, we present the proposed architecture extensions and discuss design constraints. Section 3 describes the implementation details and how haptic properties are added and rendered within the Blender framework. In Section 4 we discuss related issues and future work avenues.
II. SYSTEM OVERVIEW AND ARCHITECTURE
The Blender design philosophy is based on three main tasks: data storage, editing, and visualization. According to the legacy documentation [3], it follows a data-visualize-edit development cycle for the 3D modeling pipeline. A 3D scene is represented using data structures within the Blender architecture. The modeler views the scene, makes changes using the editing interface which directly modifies the underlying data structures, and then the cycle repeats.
To better understand this development cycle, consider the representation of a 3D object in Blender. A 3D object may be represented by an array of vertices which have
been organized as a polygonal mesh. Users may choose to operate on any subset of this data set. Editing tasks may include operations to rotate, scale, and translate the
vertices, or perhaps a re-meshing algorithm to "clean up" redundant vertices and transform the mesh from a quad to a triangle topology. The data is visualized using a graphical 3D renderer which is capable of displaying the object as a wireframe or as a shaded, solid surface. The visualization is necessary in order to see the effects of editing on the data. In a nutshell, this example defines the design philosophy behind Blender's architecture.
In Blender, data is organized as a series of lists and base data types are combined with links between items in each list, creating complex scenes from simple structures.
This allows data elements in each list to be reused, thus reducing the overall storage requirements. For example, a mesh may be linked by multiple scene objects, but the position and orientation may change for each object and the topology of the mesh remains the same. A diagram illustrating the organization of data structures and reuse of scene elements is shown in Figure 2. A scene object links to three objects, each of which link to two polygonal meshes. The meshes also share a common material property. The entire scene is rendered on one of several screens, which visualizes the scene.
We adopt the Blender design approach for our authoring tool. The data structures which are used to represent objects in a 3D scene have been augmented to include fields for haptic properties (e.g., stiffness, damping); user interface components (e.g., button panels) which allow the modeler to change object properties have also been updated to include support for modifying the haptic properties of an object. Additionally, an interactive hapto-visual renderer has been implemented to display the
3D scene graphically and haptically, providing the modeler or artist with immediate feedback about the changes they make to the scene. In the current version of HAMLAT, the modifications to the Blender framework include: data structures for representing haptic properties; an editing interface for modifying haptic properties; an external renderer for displaying and previewing haptically enabled scenes; and scripts which allow scenes to be imported/exported in the HAML file format.
A class diagram outlining the changes to the Blender framework is shown in Figure 3. Components which are pertinent to HAMLAT are shaded in gray. HAMLAT builds on existing Blender sub-systems by extending them for haptic modeling purposes. Data structures for representing object geometry and graphical rendering are augmented to include fields which encompass the tactile properties necessary for haptic rendering.
To allow the user to modify haptic properties, GUI components are integrated as part of the Blender editing panels. The operations triggered by these components operate directly on the data structures used for representing haptic cues and may be considered part of the editing step of the Blender design cycle.
Similarly to the built-in graphical renderer, HAMLAT uses a custom renderer for displaying 3D scenes graphically and haptically, and it is independent of the Blender renderer. This component is developed independently since haptic and graphical rendering must be performed simultaneously and synchronously. A simulation loop is used to update haptic rendering forces at a rate which maintains stability and quality. A detailed discussion of the implementation of these classes and their connectivity is given in the next section.
III. IMPLEMENTATION
A. Data Structures
A.1 Mesh Data Type
Blender uses many different data structures to represent the various types of objects in a 3D scene: a mesh object contains an array of vertices; a lamp contains colour and intensity values; and a camera object contains intrinsic viewing parameters.
The Mesh data structure is used by the Blender framework to describe a polygonal mesh object. It is of particular interest for haptic rendering since many solid objects in a 3D scene may be represented using this type of data structure. The tactile and kinesthetic cues, which are displayed due to interaction with virtual objects, are typically rendered based on the geometry of the mesh. Haptic rendering is performed based primarily on data stored in this data type. Other scene components such as lamps, cameras, or lines are not intuitively rendered using force-feedback haptic devices and are therefore not of current interest for haptic rendering.
An augmented version of the Mesh data structure is shown in Figure 4. It contains fields for vertex and face data, plus some special custom data fields which allow data to be stored to/retrieved from disk and memory. We have modified this data type to include a pointer to a MHaptics data structure, which stores haptic properties such as stiffness, damping, and friction for the mesh elements (Figure 5).
A.2 Edit Mesh Data Type
It should be noted that the Mesh data type has a complementary data structure, called EditMesh, which is used when editing mesh data. It holds a copy of the vertex, edge, and face data for a polygonal mesh. When the user switches to editing mode, Blender copies the data from a Mesh into an EditMesh, and when editing is complete the data is copied back.
Care must be taken to ensure that the haptic property data structure remains intact during the copy sequence. The EditMesh data structure has not been modified to contain a copy of the haptic property data, but this may change in the future (if modifying haptic properties in edit mode is required). The editing mode is mainly used to modify mesh topology and geometry, not the haptic and graphical rendering characteristics.
A.3 Haptic Properties
In this section we briefly discuss the haptic properties which may currently be modeled using HAMLAT. It is important for the modeler to understand these properties and their basis for use in haptic rendering.
The stiffness of an object defines how resistant it is to deformation by some applied force. Hard objects, such as a rock or a table, have very high stiffness; soft objects, such as a rubber ball, have low stiffness. The hardness-softness of an object is typically rendered using the spring-force equation:

f = ks · x

where the force-feedback vector f displayed to the user is computed from ks, the stiffness coefficient (variable name stiffness) of the object, and x, the penetration depth (displacement) of the haptic proxy into the object. The stiffness coefficient has a range of [0,1], where 0 represents no resistance to deformation and 1 represents the maximum stiffness which may be rendered by the haptic device.

The damping of an object defines its resistance to the rate of deformation due to some applied force. It is typically rendered using the force equation:

f = kd · (dx/dt)

where kd is the damping coefficient (variable name damping) and dx/dt is the velocity of the haptic proxy as it penetrates the object. The damping coefficient also has a range of [0,1] and may be used to model the viscous behaviour of a material. It also increases the stability of the haptic rendering loop for stiff materials.
The static friction (variable name st_friction) and dynamic friction (variable name dy_friction) coefficients are used to model the frictional forces experienced while exploring the surface of a 3D object. Static friction is experienced when the proxy is not moving over the object's surface, and an initial force must be applied to overcome it. Dynamic friction is felt when the proxy moves across the surface, rubbing against it.
Frictional coefficients also have a range of [0,1], with a value of 0 making the surface of a 3D object feel "slippery" and a value of 1 making the object feel very rough. Frictional forces are typically rendered in a direction tangential to the collision point of the haptic proxy at the object's surface.

B. Editing

Blender uses a set of non-overlapping windows called spaces to modify various aspects of the 3D scene and its objects. Each space is divided into a set of areas and panels which are context aware. That is, they provide functionality based on the selected object type. For example, if a camera is selected, the panel will display components which allow the modeler to change the focal length and viewing angle of the camera, but these components will not appear if an object of another type is selected.
Figure 6 shows a screen shot of the button space which is used to edit properties for a haptic mesh. It includes user-interface panels which allow a modeler to change the graphical shading properties of the mesh, perform simple re-meshing operations, and to modify the haptic properties of the selected mesh.
HAMLAT follows the context-sensitive behavior of Blender by only displaying the haptic editing panel when a polygonal mesh object is selected. In the future, this panel may be duplicated to support haptic modeling for other object types, such as NURBS surfaces. The Blender framework offers many user-interface components (e.g., buttons, sliders, pop-up menus) which may be used to edit the underlying data structures. The haptic properties of mesh objects are editable using sliders or by entering a float value into a text box located adjacent to the slider. When the value of the slider/text box is changed, it triggers an event in the Blender window sub-system. A unique identifier indicates that the event is for the haptic property panel, so the HAMLAT code is called to update the haptic properties of the currently selected mesh.
C. Hapto-Visual Rendering
Blender currently supports graphical rendering of scenes using an internal renderer or an external renderer (e.g., [4]). In this spirit, the haptic renderer used by HAMLAT has been developed as an external renderer. It uses OpenGL and the OpenHaptics toolkit [5] to perform graphic and haptic rendering, respectively.
The 3D scene which is being modeled is rendered using two passes: the first pass renders the scene graphically, and the second pass renders it haptically. The second pass is required because the OpenHaptics toolkit intercepts commands sent to the OpenGL pipeline and uses them to display the scene using haptic rendering techniques. In this pass, the haptic properties of each mesh object are used much in the same way color and lighting are used by graphical rendering: they define the type of material for each object. To save CPU cycles, the lighting and graphical material properties are excluded from the haptic rendering pass.
Figure 7 shows source code which is used to apply the material properties during the haptic rendering pass. The haptic renderer is independent from the Blender
framework in that it exists outside the original source code. However, it is still heavily dependent on Blender data structures and types.
D. Scripting
The Blender Python (BPy) wrapper exposes many of the internal data structures so that the internal Python scripting engine may access them. Similar to the data structures used for representing mesh objects in the native Blender framework, wrappers allow user-defined scripts to access and modify the elements in a 3D scene.
The haptic properties of a mesh object may be accessed through the Mesh wrapper class. A haptics attribute has been added to each of these classes and can be accessed through the Python scripting system. Figure 8 shows Python code to read the haptic properties from a mesh object and export them to a file. Similar code is used to import/export HAML scenes from/to files.
An import script allows 3D scenes to be read from a HAML file and reproduced in the HAMLAT application; an export script allows 3D scenes to be written to a HAML file, including haptic properties, for use in other HAML applications.
The BPy wrappers also expose the Blender windowing system. Figure 9 shows a panel which appears when the user exports a 3D scene to the HAML file format. It allows the user to specify supplementary information about the application, such as a description, target hardware, and system requirements.
These are fields defined by the HAML specification [2] and are included with the authored scene as part of the HAML file format. User-interface components displayed on this panel are easily extended to agree with the future revisions of HAML.
IV. CONCLUSION AND FUTURE WORK

The current version of HAMLAT shows that a unified modeling tool for graphics and haptics is possible. Promisingly, the features for modeling haptic properties have been integrated seamlessly into the Blender framework, which indicates that it was a good choice as a platform for the development of this tool. Blender's modular architecture will make future additions to its framework very straightforward.
Currently, HAMLAT supports basic functionality for modeling and rendering hapto-visual applications. Scenes may be created, edited, previewed, and exported as part of a database for use by other hapto-visual applications, such as the HAML player [6]. However, there is room for growth, and there are many more ways we can continue leveraging existing Blender functionality.
As for future work, we plan to extend HAMLAT to include support for other haptic platforms and devices. Currently, only the PHANTOM series of devices is supported, since the interactive renderer is dependent on the OpenHaptics toolkit [5]. In order to support other devices, a cross-platform library such as CHAI3D or Haptik may be used to perform rendering. These libraries support force rendering for a large range of haptic hardware. Fortunately, due to the modularity of our implementation, only the interactive haptic rendering component need be altered for these changes.
In addition to supporting multiple hardware platforms, a user interface co