Improving Sonar Sensor Fidelity in a Robot Simulator

by

Allan Edward Kranz

B.Sc., University of Northern British Columbia, 2001

THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF MATHEMATICAL, COMPUTER, AND PHYSICAL SCIENCES IN COMPUTER SCIENCE

UNIVERSITY OF NORTHERN BRITISH COLUMBIA

September 2014

© Allan Edward Kranz, 2014

UMI Number: 1526521. Published by ProQuest LLC, 2015. Copyright in the dissertation held by the author.

ABSTRACT

It is slow and expensive to develop robot control systems using real robots. Simulation can lower both the time and the cost. To take advantage of the benefits of development in a simulator, we need high-fidelity representations of actual sensors. Sensors do not provide perfect data, and simulations that use either perfect models or models that are too simple will not translate well into the real world. This research introduces a sensor model that overcomes some of the limitations of current simulations and provides a methodology for developing both new models and corresponding testing regimes. An actual sensor is used in realistic situations to create authentic models that more closely match the performance of the robot in the real world.
A simple sonar sensor is tested against three generic obstacles, and a realistic software simulation model of its capabilities is created. The Simbad robot simulator is modified to use this model, a testing regime is created to validate the results, and improved performance over the existing model is achieved.

TABLE OF CONTENTS

Abstract
Table of Contents
List of Tables
List of Figures
Acknowledgments

Chapter 1  Introduction
    Overview
    Problem Statement
    Contributions
    Thesis Outline

Chapter 2  Definitions and Literature Review
    Definitions
        Robots
        Control Architectures
        Fidelity
        Noise
    Literature Review
        General
        General Sonar Sensors Models
        Existing SRF04 Sonar Sensor Models
    Current Simulators
        Commercial Software
        Simulators with No Sonar Capability
        Abandoned or Unavailable
        Some Capability
    Why Simbad?
        Open Source and Free
        Suitability for Improvement
        Heavily Used
        Written in Java

Chapter 3  The Equipment
    The SRF04 Sonar Module
    The Microprocessor
    The Obstacles

Chapter 4  The Experiments
    The Sensor and Obstacles
        The Sonar Sensor
        The Square Obstacle Experiment
        The Round Obstacle Experiment
        The Angle Obstacle Experiment
    The Software
        Introduction
        Original Software
        Software Modifications
        Determining Detection
        SimBad Organization
    Testing the Modified Software

Chapter 5  The Results
    Introduction
    Using The Regular Sensor Model
    Using Enhanced Sensor Model
    Comparing the Results
    Error Determination
    Error Calculations
    Error Summary

Chapter 6  Conclusions

Chapter 7  Future Work
    Physics Based Systems
    Empirically Determined Systems
    API for Sensor Modeling

Bibliography

Appendix A
    Sonar.c
    AngleTargetShadow.java

LIST OF TABLES

Summary of Errors

LIST OF FIGURES

1.1  Devantech SRF04 Sonar
3.1  Devantech SRF04
3.2  SRF04 Timing
3.3  Sonar Setup
3.4  Microprocessor
3.5  Development Board
3.6  C18 Compiler
3.7  Square Aluminum
3.8  Round Aluminum
3.9  Angle Aluminum
4.1  Sonar Sensitivity Plot from Devantech Ltd.
4.2  Orientation of Square Obstacle
4.3  Square Obstacle Plot
4.4  Orientation of Round Obstacle
4.5  Round Obstacle Plot
4.6  Orientation of Angle Obstacle
4.7  Angle Obstacle Plot
4.8  Example of Overlap
4.9  Example of No Overlap
4.10  Basic Simbad Layout
4.11  Derived from BlockWorldObject
4.12  Derived from Shape3D
4.13  Square Version
4.14  Round Version
4.15  Angle Version
4.16  Sample Log Data
4.17  Square Original Trace
4.18  Round Original Trace
4.19  Angle Original Trace
4.20  Square Improved Trace
4.21  Round Improved Trace
4.22  Angle Improved Trace
5.1  Threshold Detection Diagrams
5.2  Traces for Unimproved Model
5.3  Traces for Improved Model
5.4  Error Determination

ACKNOWLEDGMENTS

I wish to acknowledge the support of my original supervisor, Dr. Charles Brown, whose patience has finally been rewarded. The support and encouragement he provided made the difference in completing this thesis. Dr. Liang Chen, my co-supervisor who took over when Charles retired, is owed a huge thanks for his perseverance in putting up with my intermittent work schedule. My wife Dee has been behind me all the way, and her support, along with my daughter Iliana's, has made this possible. Many nights and weekends were sacrificed to produce this thesis.

Chapter 1

Introduction

1.1 Overview

Computers are already pervasive in modern Western society, and robots are slowly becoming so as well [25]. It is slow, expensive, and potentially dangerous [2] to develop robots in real-world situations, so simulations will play an increasingly important role in the creation and validation of robot programming. NASA1, in their online documentation [22] for the Robonaut 1, states that it is always risky to test unverified control algorithms on robotic hardware and that a simulation could fulfill the need to minimize risk.
Mobile robots observe their environments through the medium of their sensors, so simulations should strive to model sensors to the highest level of fidelity [12]. However, modeling sensors using first principles is a difficult and time-intensive task, so most simulations make use of simplified, and often idealized, methods to reproduce sensor outputs [12]. For simulations to provide accurate and realistic training and testing, the sensors used in the simulation should act as closely to real sensors as possible.

The robot controller is essentially the brain of the robot [3]. The usual assumption is that the information provided to the brain is a somehow perfect representation of the outside world, but this is not necessarily the case [13]. If some noise or imperfection in the data supplied by the sensors is acknowledged, then it is usually assumed that it has some particular mathematical property, like being Gaussian2 with clear mathematical properties [14] or being amenable to first-order filters of some type [7]. These assumptions are used in robots, particularly those used in simulations, because they make dealing with assumed corruption or inaccuracy of sensor data a simple process.

1National Aeronautics and Space Administration

1.2 Problem Statement

The problem arises with the realization that real sensors do not act in simple, predictable ways. The assumption that a simple model for sensors provides for a good simulation quickly falls apart. Brooks [4] noted that there would be a near certainty that programs that work well on simulators would fail in real robots because it is hard to simulate the dynamics of the real world. Jakobi [17] also notes that unrealistic sensor models will fail to work in reality. This idea is repeated by Davison [7], where he notes that for a model to be useful we must know the range of error in the model we are working with.
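The idealized noise assumption criticized above can be made concrete with a small sketch. This is not the thesis's model; it is a minimal illustration, written in Java (the language of the Simbad simulator used later), of the common practice of adding zero-mean Gaussian noise to a perfect range value. The class and parameter names are hypothetical.

```java
import java.util.Random;

/** Sketch of the common idealized sensor-noise assumption:
    a reading is the true distance plus zero-mean, normally
    distributed noise. Real sonars are rarely this well behaved. */
public class IdealizedSonar {
    private final Random rng = new Random(42); // fixed seed for repeatability
    private final double sigmaMeters;          // standard deviation of the noise

    public IdealizedSonar(double sigmaMeters) {
        this.sigmaMeters = sigmaMeters;
    }

    /** Returns a simulated reading for a known true distance. */
    public double read(double trueDistanceMeters) {
        return trueDistanceMeters + rng.nextGaussian() * sigmaMeters;
    }

    public static void main(String[] args) {
        IdealizedSonar sonar = new IdealizedSonar(0.01); // 1 cm noise
        double sum = 0;
        int n = 10000;
        for (int i = 0; i < n; i++) sum += sonar.read(1.50);
        // With zero-mean noise the average converges to the true distance,
        // which is exactly why this assumption makes filtering look easy.
        System.out.printf("mean of %d readings: %.3f m%n", n, sum / n);
    }
}
```

The convenience is clear: under this assumption, simple averaging recovers the truth. The chapters that follow argue that real sensor error does not have this tidy structure.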
Dittmar [10] notes that the correlation between the simulated EyeBot3 and the real EyeBot he worked with was never tested, and this led to a need to produce a detailed error model that compared with the real counterpart.

2The Gaussian or Normal distribution is a very commonly occurring continuous probability distribution.

3EyeBot is a controller for mobile robots with wheels, walking robots, or flying robots. It consists of a powerful 32-bit microcontroller board with a graphics display and a digital grayscale or color camera. The camera is directly connected to the robot board (no frame grabber). This allows us to write powerful robot control programs without a big and heavy computer system and without having to sacrifice vision, the most important sensor.

1.3 Contributions

A new and more realistic simulation model for a popular and useful sonar sensor, the Devantech Ltd. SRF04, is the major contribution of this research. Currently no standard test suites exist for sensors used on robots or in simulations, so the test suites developed for the SRF04 are an additional contribution of this research. There are also no current standards for testing the simulation fidelity of sensors, so the tests developed are a further contribution. New hardware was evaluated, assembled, and tested for use in this research, and it remains available for other researchers to use.

The approach taken here was different: a model was created from the real performance of an actual sonar sensor in the real world. No assumptions were made about how sonar works or how it interacts with the environment. A number of typical situations were created, a real SRF04 sonar sensor was tested in them, and the data were used to create a model for use in the Simbad robot simulator.

Figure 1.1: Devantech SRF04 Sonar

This improved model would be useful for both researchers and industrialists using an SRF04, or similar, sonar sensor in robot development.
The new model provides significant increases in the accuracy of the Simbad simulator and allows for a better translation to a real-world machine.

1.4 Thesis Outline

In this thesis we examine some uses of sonars in robotics and show how a good simulation model is a powerful tool to have available. We also construct a model for a current simulator and examine some areas of its performance.

Chapter 1 provides a brief introduction to the problem and the importance of good robot simulations. Chapter 1 also outlines the contributions I have made in this thesis.

Chapter 2 examines the current literature about sonar sensors and their use in robotics and simulations. I note some problems encountered by previous researchers and examine why there is a strong need for a good sonar model in robotic simulation.

Chapter 3 looks at the equipment used to construct the sonar models used in our improved simulation. The SRF04 sonar sensor, the PIC18F4320 microprocessor, the Simbad simulator, and the obstacles chosen are examined.

Chapter 4 describes the experiments performed. This was a three-stage process: determining the empirical properties of the sensor itself, integrating this information into the Simbad simulation software, and testing the new simulation against the original version to determine how much more realistically the new version performed.

Chapter 5 discusses the results obtained from the experiments. The results achieved are detailed in Section 5.4.

Chapter 6 draws the conclusion that incorporating some of the properties of the sensor into the actual obstacle allowed more realistic interaction between the sensor and the obstacle. The new simulation reduced the error by 34% over the previous version.

Chapter 7 predicts that the need for realistic sensor models will grow with time as robots become more common and more sophisticated. Three possible directions are discussed.

Chapter 2

Definitions and Literature Review

2.1 Definitions

2.1.1 Robots

A robot, for the purposes of this paper, is an autonomous agent that consists of a set of sensors, a set of actuators, a locomotion system, an on-board computer, and a control system that makes it all work. The control system is typically software that runs on the on-board computer, but it does not have to be; some of the control may be off-loaded onto subordinate processors or electro-mechanical subsystems, both located on the robot itself.

By autonomous, we mean that the robot has the ability to sense the situation it is in and act appropriately. It must possess the capability to make these decisions solely on the input from its set of sensors, without any extra information from outside sources. It must also do all of the processing required by itself and have no reliance on any outside processing capabilities.

2.1.2 Control Architectures

The control systems used by robots refer to the ways in which the sensing and action of a robot are coordinated. These systems fall across a broad but well-defined spectrum of possibilities. This spectrum includes Reactive control (do not think, just act or react), Behavior-based control (think the way you act), and Deliberative control (think a lot, then act later). Spread over this spectrum are various hybrid control systems, including some where thinking and acting are done separately but in parallel.

2.1.3 Fidelity

Fidelity is the ability of the simulated sensor to accurately model the capabilities of the real-world sensor. The fidelity required for one purpose may not be the same as the fidelity required for another but, whatever level is required, working with sensor models that do not provide sufficient fidelity is a recipe for failure.
2.1.4 Noise

Noise, in a robot sensor context, is unwanted signal variation that causes an inaccurate representation of the actual world the robot finds itself in. In a typical simulated environment the information provided by the sensors is usually perfect and can be relied upon to give an accurate rendition of the external world. Even when noise is incorporated into the system to provide more realism, it is usually noise with known statistical properties, such as having a mean of zero and being normally distributed [1]. In real-life applications noise can be random, bursty, or distributed in irregular fashions that make removal difficult to impossible and/or extremely time-consuming.

2.2 Literature Review

Despite the fact that sonar sensors have been around for decades, surprisingly little research has been done on how to model them in simulations [12].

2.2.1 General

Attempts have been made to incorporate sensors into robot simulators before, but they have met with varied success. Some researchers have used genetic algorithms and evolutionary programming [5] to make sensors work in their simulations; others have tried mathematical models [20] of sensors. Simplistic models, in the sense of being easy to program, have been tried [12] but have not led to satisfactory results.

In 1986 Kuc and Siegel [19] proposed a sonar simulator that combines concepts from the fields of acoustics, linear system theory, and digital signal processing, but it assumes that all objects have mirror-like reflective qualities. Almost all obstacles do not provide this level of reflectivity. The processing time to build the map of the surroundings also exceeded ten minutes per iteration, making this model unrealistic for use in the real world [19]. The Polaroid sonar transceiver they based their model on is no longer in production.
In 1988 Zelinsky [27] used real sonars to build a map of a robot's surroundings using straight line segments but did not extend that to a simulator or make any mention of how to build a model for simulation.

In 1992 Brooks [5] explored the difficulties involved in transferring programs evolved in a simulated environment to run on a real robot. Brooks also notes that there can be a vast difference between the interactions with the environment for simulated and real robots and that this could lead to failure of the real robots in practice. Brooks says that the number of trials needed to produce a robot using evolutionary programming techniques precludes the use of real robots and forces the use of simulations. He stresses the need for realistic sensor models in order for the evolutionary systems to solve the real problems and not be misled into creating solutions for problems that do not, in fact, exist in the real world. However, Brooks does not provide any guidance on how to construct a good model and simply notes that final tuning may need to be done by hand.

A 1994 paper by Kortenkamp et al. [18] shows the need for realistic sensors.

Our experience demonstrates an important lesson in mobile robotics - if the low level sensing of the world is not working correctly, then the high level reasoning or map making will be unsuccessful, no matter how elegant their implementations.

In 1996 Dudek [11] at the McGill Research Center for Intelligent Machines noted that while sonars are ubiquitous, models for them are not.

Although sonar has become a ubiquitous sensor in mobile robotic systems, surprisingly few results are available that accurately model the typical behavior of the sensor under such conditions.

In 1997 Yang et al.
[26] made note that most simulators of the time were concerned solely with the robot and did not model sensors at all:

In the past, most simulation and animation systems utilized in robotics were concerned with simulation of the robot and its environment without simulation of sensors. These systems have difficulty in handling robots that utilize sensory feedback in their operation.

They propose a sensor fusion model, including sonar, that uses a mathematical error model, but they do not implement it or any part of it. They propose three noise models for an ultrasonic sensor but only propose an algorithm to deal with these noise models.

In 2004 O'Sullivan [23], from the University of Limerick in Ireland, said that of the two models he analyzed, the first used a Gaussian distribution to model its sonar sensors and the second used a Normal distribution, but neither was tied to a real sensor at all.

Two sonar models are quantitively contrasted in this paper. The first is the two dimensional Gaussian sonar model proposed by Moravec and Elfes1. The second is the sonar model designed by Konolige2, the multiple target model, which is based on the normal distribution.

In 2008 Kyriacou et al. [20] said that simulators are useful tools for developing robot behavior. However, they also said that there are considerable differences between the behavior of the robot in the simulator and the behavior of the robot in the real world. They used real sonar sensors to construct a mathematically explicit model based on NARMAX3 polynomials, but these are specific to each obstacle and sensor combination. Kyriacou [20] also said that a lack of theoretical foundations for mobile robotics necessitated using idealistic and simplistic sensor models.

From the US Army Engineer Research and Development Center and the National Robotics Engineering Center, Philip Durst et al. [12] lamented the lack of accurate models.
In their 2011 paper they noted that simplistic sensor models can have a deleterious effect on the ability of the robot. The following quote sums up the situation nicely.

As sensors are the medium through which mobile robots observe their environments, it seems only intuitive that mobile robot simulations should strive to model sensors to the highest level of fidelity. However, modeling sensors using first principles is a difficult and time-intensive task, so most simulations make use of simplified, and often idealized, methods to reproduce sensor outputs. While some effort has been given to quantifying the gap between simulation and reality for mobile robots, no work has been done specifically to address the shortcomings of these simplified sensor models. Furthermore, much of the research that does exist is outdated, with very little recent research available addressing the issue.

1Moravec, H. P., and Elfes, A. 1985. High Resolution Maps from Wide Angle Sonar. Proceedings of the 1985 IEEE International Conference on Robotics and Automation.

2Konolige, K. 1997. Improved Occupancy Grids for Map Building. Autonomous Robots 4(4): 351-367.

3The Nonlinear Autoregressive Moving Average model with eXogenous inputs (NARMAX) can represent a wide class of nonlinear systems.

The problems evident in all the sensor models fall into two basic categories. The first category is overly simplistic models that do not capture essential characteristics of the sensor. The second category is models that do not match the sensor they are meant to simulate.

2.2.2 General Sonar Sensors Models

Some research has been conducted on sonar sensors similar to the one used in this thesis. Some have worked with the Polaroid sonar sensor; for instance, Cao and Borenstein [6] built a phased array out of Polaroid 6500 sensors in 2002 and tested it. However, they did not construct or give any guidance for creating a model for use in simulation.
Paolini, Huber, Collier, and Lee [24] built a robot in 2011 using an array of 10 MaxBotix LV-EZ1 ultrasonic range finder sensors. Their work does develop a model for an array of sonar sensors that forms a localized map of a robot's surroundings, but it does not model individual sonar sensors.

2.2.3 Existing SRF04 Sonar Sensor Models

There has been very little work done with the Devantech SRF04 sonar sensor despite this sensor being popular and widely used. Dinh and Inanc [9] built and tested a robot using the SRF04 in 2009 but do not develop a model for the SRF04, saying only this about it.

They also might detect false echoes or distance returned by the sensor which may not correspond to the actual distance to the object. This is especially true for indoor environments where the ping sound wave might get reflected from multiple objects. A [sic] simple solution to this problem is to average several sonar readings.

2.3 Current Simulators

Current simulators do not do a good job of modeling realistic sensors. In particular, the Microsoft Robotics Studio online documentation states the basic problem very nicely.

Incomplete and Inaccurate Models. A large number of effects in the real world are still unexplained or very hard to model. This means the programmer may not be able to model everything accurately, especially in real time. For certain domains, like wheeled vehicles, motion at low speeds is still a big challenge for simulation engines. Modeling sonar is another.

A survey of current simulators was performed, and the simulators found were grouped as follows.

1. Commercial software with licenses that must be purchased.
2. Simulators with no sonar capability.
3. Simulators that have been abandoned or are unavailable for any reason.
4. Simulators with some capabilities.

The simulators in the first group were rejected as candidates because they cost money and/or have proprietary code.
The second group was rejected because they had no support for, and in some cases no use for, a sonar sensor model. The third group was rejected because they have been abandoned and do not appear to be used. The fourth group includes current free simulators with some type of sonar capability, and these became the candidates for improvement in this study.

2.3.1 Commercial Software

1. MobotSim4

Configurable 2D simulator of mobile robots. Features a graphical interface where robots and objects are easily configured and a built-in BASIC editor for simulation development.

4http://www.mobotsoft.com/?page_id=9

2. Camelot - Robot Offline Programming and Simulation Software5

Camelot has therefore developed a robot programming system for use in design, lay-out, production and maintenance of work cells in integrated production systems.

3. anyKode Marilou Robotics Studio6

Complex robotic structures and simulation environments can be easily and rapidly constructed with the editor's CAD style interface. The most popular devices like panoramic spherical cameras, motors, servos, US/IR/Laser sensors and others let you create realistic simulations.

4. Cyberbotics Webots7

A realistic mobile robots simulator that includes models for the Kheperas, Alice, E-puck, Hoap-2, KHR2-HV, Nao and other robots as well as many sensors and actuators, simulated cameras, infra-red sensors, force sensors, etc. The user can program virtual robots using a C/C++ or Java library. A 3D environment editor allows users to customize robotics scenarios.

2.3.2 Simulators with No Sonar Capability

1. BSim8

A behavior based robot simulator. BSim is designed to allow users to experiment with behavior based programming techniques without requiring access to an actual robot.
BSim enables users to create simple worlds of rigid objects and light sources and to program robots to interact with these worlds. Various behaviors and tasks are built into the simulator to give users a feel for what can be accomplished with behavior-based programming.

BSim does not model sonar sensors.

5http://www.camelot.dk/
6http://www.anykode.com/index.php
7http://www.cyberbotics.com/
8http://bsim.sourceforge.net/

2. RoboWorks9

RoboWorks is an easy to use software tool for 3D modeling, simulation and animation of any physical system.

RoboWorks does not support sonar sensors.

3. Encarnação Robot Simulator10

The project is composed of a software intended to simulate a robot capable of moving objects in a room. There is no hardware involved, just a simulation!

The Encarnação Robot Simulator does not have sonar sensors.

4. Mobile Robot Simulators11

To prepare a miniature robot soccer team to be able to play a game against a human controlled or another computer controlled soccer team.

Mobile Robot Simulators does not have sonar sensors at all.

5. ThreeDimSim12

ThreeDimSim is a powerful 3D mechanics simulator and rendering program. It enables you to realistically simulate 3D scenes, based on elementary mechanics. Application fields include engineering, education and graphical authoring.

ThreeDimSim does not model sensors of any kind.

9http://www.newtonium.com/
10http://www.encarnacao.com/e.index.htm
11http://robotsimulators.8m.com/
12http://www.havingasoftware.nl/

6. Neuro-Evolving Robotic Operatives13

Neuro-Evolving Robotic Operatives, or NERO for short, is a unique computer game that lets you play with adapting intelligent agents hands-on. Evolve your own robot army by tuning their artificial brains for challenging tasks, then pit them against your friends' teams in online competitions!

Neuro-Evolving Robotic Operatives does not include sonar sensors.

13http://nerogame.org/

7. Game to Simulator Conversion

There are a number of games that can be converted to serve as a testing ground for AI. Game AI has many requirements: path-finding, strategy-making, and working for, with, or against the player. None of these games incorporate sonar sensors.

(a) Grand Theft Auto series: huge maps, driving/flying AI, path-finding.

(b) USARSim: a modification for the game Unreal Tournament 2004. It has been used for the Virtual RoboCup, for general simulation, and to show how robots can rescue people.

2.3.3 Abandoned or Unavailable

(a) Juice14

Juice is a software tool for the high-level design of robots, particularly robots that walk or use other non-wheeled methods of locomotion. You can add beams to the robot, connect them with hinges and sliders, and then motorize the hinges and sliders to make the robot walk (or slither). An open-source dynamics engine provides realistic physics for whatever sort of robot you create.

Juice appears abandoned; the last login was 2006-10-05 06:33:27, and the URL no longer exists. Juice did not support sonar sensors.

14http://www.natew.com/juice/

(b) Bugworks 2D Robot Simulator15

A free 2D robot simulator written in Java with a cut-and-paste user interface.

Bugworks is no longer available. Checked 11 Feb 2012; message: Access restricted / Directory listing suppressed.

(c) Multi-Body Simulator (MBSim)16

A Multi-Body Simulator (MBSim) is an object-oriented system that models, simulates, and animates the kinematics and dynamics of robotic arms and vehicles. This system creates a three-dimensional graphical environment which can be used as a powerful tool in robotic design and control. In addition, MBSim incorporates range finder as well as ultrasonic sensor models to yield environmental feedback. In fact, a motivation for developing MBSim stemmed from the realization that most commercially available packages are not as flexible as is often needed.
For example, while it is rare for a typical system to handle industrial robots AND vehicles, it is quite uncommon for them to be able to handle sensors as well as a mechanism's kinematics and dynamics.

While MBSim did include sonar sensors, the project appears abandoned; it was last updated on March 9, 1996.

15http://www.cogs.susx.ac.uk/users/christ/bugworks/
16http://www.imdl.gatech.edu/ultrasonic/index.html

2.3.4 Some Capability

(a) Simbad17

Simbad is a Java 3D18 robot simulator for scientific and educational purposes. It is mainly dedicated to researchers/programmers who want a simple basis for studying Situated Artificial Intelligence, Machine Learning, and more generally AI algorithms, in the context of Autonomous Robotics and Autonomous Agents. It is not intended to provide a real world simulation and is kept voluntarily readable and simple.

Simbad has sonar sensors arranged in a horizontal belt but only uses Java 3D region clipping for the detection threshold. It does not model any particular sensor in use today.

(b) Microsoft® Robotics Studio19

Microsoft Robotics Developer Studio 4 Beta 2 (RDS) is a Windows-based environment for hobbyist, academic and commercial developers to create robotics applications for a variety of hardware platforms. RDS includes a lightweight REST-style, service-oriented runtime, a set of visual authoring and simulation tools, as well as tutorials and sample code to help get started.

The built-in model is a generic model20 that enables you to access data from a sonar sensor, including information about the current distance measurement and angular range and resolution. It is not meant to model any particular sensor in use today.

(c) Player/Stage21

17http://simbad.sourceforge.net/
18Java 3D is a scene graph based 3D application programming interface (API) for the Java platform. It runs atop either OpenGL or Direct3D.
Java 3D is currently developed under the Java Community Process.

19http://msdn.microsoft.com/en-us/library/bb881626.aspx
20http://msdn.microsoft.com/en-us/library/dd126872.aspx checked 1 Feb 2013
21http://playerstage.sourceforge.net/

The Player Project creates Free Software that enables research in robot and sensor systems. The Player robot server is probably the most widely used robot control interface in the world. Its simulation back ends, Stage and Gazebo, are also very widely used. Released under the GNU General Public License, all code from the Player/Stage project is free to use, distribute and modify. Player is developed by an international team of robotics researchers and used at labs around the world.

Player/Stage does allow for programmer-created noise but does not have built-in support for a sonar model. The software is Open Source, so modification would be possible.

(d) MOBS - Mobile Robot Simulator22

MOBS is a fully 3-dimensional simulation system for mobile robot systems. The simulator understands the same ASCII sequences as the Robuter II robot. The simulator can be connected to a robot application program even without re-compilation of the application program. Sensors modeled are odometry, bumpers, sonar sensors, and camera view (using the SGI's Inventor library). This robot is a command-driven vehicle with up to 24 ultrasonic-sensors at the sides and cameras attached to the working-plate. If you don't have a real Robuter or at least the manuals this program may be useless!

MOBS has sonar sensors with probably the best model currently available. It is based on an elementary physics engine using time of flight. However, the online notes23 say that without a real Robuter the simulation is useless.

2.4 Why Simbad?

Simbad was chosen from the four simulators with some capability because it is available, free, simple, and heavily used.
Previous students [21] have used Simbad successfully to simulate collision avoidance.

22 http://robotics.ee.uwa.edu.au/mobs/
23 http://robotics.ee.uwa.edu.au/mobs/ftp/README

2.4.1 Open Source and Free

The Simbad project is hosted on SourceForge. The Simbad simulator is free for use and modification under the conditions of the GNU General Public License [15]. In February of 2008, the entire Java 3D source code was released under the GPL version 2 license with the GPL linking exception, so all the software used is in the Open Source realm. This provides assurance that code this thesis depends on will not suddenly disappear due to private business interests.

2.4.2 Suitability for Improvement

Simbad's sonar sensor model is too simplistic to provide accurate results, but it does make use of Java 3D, which allows improvement using the full capabilities of Java and Java 3D. Java 3D allows the use of a three-dimensional representation of sonar in the model. It also allows us to view the model from any direction or orientation.

2.4.3 Heavily Used

Simbad has been downloaded 28,831 times24 compared to 68,039 times25 for Player. No information was available from GitHub26, where the Player project has been released since early 2012. These figures indicate that as of early 2012 Simbad's download count was roughly 40 percent of Player's, a substantial share of the open-source robot simulator market.

24 http://sourceforge.net/projects/simbad/files/ checked 24 March 2012
25 http://sourceforge.net/projects/playerstage/files/ checked 24 March 2012
26 As of 24 March 2012

2.4.4 Written in Java

The entire simulator is written in Java, and the University of Northern British Columbia uses Java for instructional purposes, thus providing a significant source of technical support for Java programming at all levels.

Chapter 3

The Equipment

3.1 The SRF04 Sonar Module

This SRF04 Ultrasonic Ranger, manufactured by Devantech Ltd.,
has a range of 3 cm to approximately 300 cm. The SRF04 has a logic line used to trigger a pulse, and the echo pulse is returned on a second line. Minimal power requirements and a compact, self-contained design make this a versatile range finder.

Figure 3.1: Devantech SRF04

Figure 3.2: SRF04 Timing

The SRF04 sonar sensor1 (see Figure 3.1) was mounted approximately 15 cm above a flat hard floor and pointed horizontally, see Figure 3.3. The transmitter/receiver pair was oriented vertically to eliminate any distortion caused by having the receiver and transmitter separated horizontally. Horizontal operation would have had the receiver and the transmitter pointed in different directions. The difference would have been small, but orienting them in a vertical line eliminates any difference in the direction they were pointed.

Figure 3.3: Sonar Setup

3.2 The Microprocessor

A Microchip® PIC 18F4320 microprocessor was programmed to control the Devantech SRF04 sonar sensor. The Microchip 18F4320 is a 10 MIPS (100 nanosecond instruction execution) CMOS FLASH-based 8-bit microcontroller with a 77-word instruction set, available in a 40-pin or 28-pin package [16].

Figure 3.4: Microprocessor

1 The SRF04 was chosen because of its low cost and high availability and the fact that the university robotics lab already had 12 of them.

The 18F4320 was inserted into a Basic Micro 28/40 Development board and a wiring harness was constructed to attach the SRF04 sonar sensor to the I/O bus on the 28/40 development board. A ten-segment LED was used to provide the output status from the microprocessor.

Figure 3.5: Development Board

The program for the microprocessor was written using Microchip's MPLAB C Compiler for PIC18 MCUs (MPLAB C18) integrated into Microchip's MPLAB Integrated Development Environment (MPLAB IDE). The program consists of about 400 lines of C code with comments (see Appendix A).
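The central calculation in that firmware is the conversion of the echo's round-trip flight time into a distance. The following sketch restates that arithmetic in Java for illustration; the class and method names are hypothetical, and the constants (a 1.0 MHz timer, so one tick per microsecond, and the speed of sound at STP) are those given in the Appendix A listing.

```java
// Sketch of the firmware's time-of-flight arithmetic, transcribed to Java
// for illustration. Timer0 runs at 1.0 MHz, so one tick is 1 us. Sound at
// STP travels 0.0331 cm/us, and the chirp covers the sensor-to-obstacle
// distance twice (out and back), hence the division by two.
public class TimeOfFlight {
    static final double CM_PER_US = 0.0331; // speed of sound at STP

    // One-way distance in cm for a round-trip echo time in ticks (us).
    static double ticksToCm(int ticks) {
        return ticks * CM_PER_US / 2.0;     // roughly ticks / 60.4
    }
}
```

The firmware itself uses the integer approximation ticks/60 for whole centimeters (or ticks/118 for 2 cm increments), which avoids floating point on the 8-bit microcontroller.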
The program uses the interrupt-on-change feature of the PORT B pins on the microprocessor to capture the flight time of the sonar chirps emitted by the SRF04 sonar sensor. It then displays the flight time on 8 segments of an LED in binary centimeters and transmits the distance over the serial port. The microprocessor fires the sonar approximately every 250 milliseconds.

Figure 3.6: C18 Compiler

3.3 The Obstacles

The obstacles used were anodized aluminum extrusions of three different cross sections. The first one was a one inch square tube. The square represented both squares and rectangles and also provides insight into outside corners on any shape.

Figure 3.7: Square Aluminum

The second obstacle was a one inch round tube. The round tube is representative of any round obstacle and any obstacle with rounded corners.

Figure 3.8: Round Aluminum

The third obstacle was a one by one by one-sixteenth inch angle. The angle represented inside and outside corners.

Figure 3.9: Angle Aluminum

Anodized aluminum extrusions were chosen because they were consistent along their length and strongly reflective of sonar signals in air.

Chapter 4

The Experiments

The experiments were conducted in three stages. The first stage determined the empirical properties of the sensor itself. The results are detailed in Section 4.1. The second stage integrated this information into the Simbad simulation software. This is detailed in Section 4.2. The third stage tested the new simulation against the original version to determine how much more realistically the new version performed. The results from the tests are detailed in Section 4.3.

4.1 The Sensor and Obstacles

4.1.1 The Sonar Sensor

The SRF04 sonar sensor detects echoes from a fan-shaped area [8], see Figure 4.1. This area is a plot of constant echo intensity against the physical location of the echo-producing object. Essentially it shows where the sonar detects a certain echo level.
In order for the sensor to detect an object, the object must be within this area and reflect enough of an echo to be above the detection level of the sensor.

Figure 4.1: Sonar Sensitivity Plot from Devantech Ltd.

In order to determine this detection threshold level for the obstacles that were used in this paper, measurements were collected from the experiments detailed in Section 4.1.2, Section 4.1.3, and Section 4.1.4. These measurements were used to create a radial plot of the detection threshold of the sensor for the three types of obstacles. Obstacles were placed vertically, see Figure 3.3, in front of the sensor and moved away from the sensor until it failed to detect the obstacle. The distance was recorded. The obstacle was rotated five degrees clockwise and then moved away until failure to detect reoccurred. These data were used to construct a detection area indicating where the sonar sensor would detect each obstacle. The results were plotted on a graph like the one in Figure 4.3, which we refer to as a detection threshold plot; it shows the distance from the sonar sensor relative to the orientation of the obstacle. This is the area that the sonar generator has to be inside of for the sonar to actually detect the square obstacle. The sonar must also be oriented correctly so that the obstacle is within the sonar's own area of detection, as seen in Figure 4.1. Both conditions must hold at the same time for the sonar to detect the object.

4.1.2 The Square Obstacle Experiment

The square obstacle was placed vertically on a flat surface and the sonar sensor was placed on the same flat surface and slowly moved towards the obstacle. When detection was confirmed, the sensor was slowly moved away from the square obstacle until the sensor failed to detect the obstacle.
Failure to detect the obstacle was determined by noting when the sonar sensor reported a time-out indication on the range value. This distance was noted for the current orientation of the square obstacle.

Figure 4.2: Orientation of Square Obstacle

The square obstacle was rotated five degrees clockwise and the sonar sensor was again slowly moved towards the obstacle until detection was confirmed and then slowly moved away until failure to detect occurred. This cycle was repeated 72 times to complete an entire rotation of the square obstacle.

Figure 4.3: Square Obstacle Plot

As can be seen in Figure 4.3, the square obstacle reflected very little from its corners and had a minimum range of about 20 cm. The range grew to about 250 cm along the flat sides of the square obstacle, where the echo was at its greatest intensity. The interesting areas are the deep indentations where the corners of the square obstacle failed to reflect much sound energy.

4.1.3 The Round Obstacle Experiment

The round obstacle was placed vertically on a flat surface and the sonar sensor was placed on the same flat surface and slowly moved towards the obstacle. When detection was confirmed, the sensor was slowly moved away from the round obstacle until the sensor failed to detect the obstacle. Failure to detect the obstacle was determined by noting when the sonar sensor reported a time-out indication on the range value. This distance was noted for the current orientation of the round obstacle.

Figure 4.4: Orientation of Round Obstacle

The round obstacle was rotated five degrees clockwise and the sonar sensor was again slowly moved towards the obstacle until detection was confirmed and then slowly moved away until failure to detect occurred.
This cycle was repeated 72 times to complete an entire rotation of the round obstacle.

Figure 4.5: Round Obstacle Plot

As expected, and as can be seen in Figure 4.5, the round obstacle reflected a constant value regardless of rotation and produced what was essentially a circular plot. The detection range was about two meters.

4.1.4 The Angle Obstacle Experiment

The angle obstacle was placed vertically on a flat surface and the sonar sensor was placed on the same flat surface and slowly moved towards the obstacle. When detection was confirmed, the sensor was slowly moved away from the angle obstacle until the sensor failed to detect the obstacle. Failure to detect the obstacle was determined by noting when the sonar sensor reported a time-out indication on the range value. This distance was noted for the current orientation of the angle obstacle.

Figure 4.6: Orientation of Angle Obstacle

The angle obstacle was rotated five degrees clockwise and the sonar sensor was again slowly moved towards the obstacle until detection was confirmed and then slowly moved away until failure to detect occurred. This cycle was repeated 72 times to complete an entire rotation of the angle obstacle.

Figure 4.7: Angle Obstacle Plot

The round obstacle and the square obstacle were centered on the axis of rotation, but the angle obstacle was placed so that it would overlap with one half of the square obstacle, thus placing it on one side of the center of rotation. This can be seen in Figure 4.6. This orientation was chosen to make the outer side of the angle obstacle match one half of the square obstacle as closely as possible.
As seen in Figure 4.7, the angle produced the most complex plot, with the interior angle reflecting the most intense echo. The maximum detection range was about three meters. The minimum was about 20 cm, matching the square obstacle very closely.

4.2 The Software

4.2.1 Introduction

Simbad is a Java 3D™ robot simulator for scientific and educational purposes [15]. Its major purpose is to provide a simple basis for studying general artificial intelligence algorithms in the context of autonomous robotics and autonomous agents [15]. It was voluntarily kept readable and simple and thus suffers from the problems mentioned previously. Specifically, the sonar model is too simple to provide realistic results.

4.2.2 Original Software

The lack of realism in Simbad is caused by the way the code1 works. It uses a large vertically2 oriented cylindrical shape, with a radius equal to an arbitrarily chosen maximum sensor range of 2.5 meters, intersected with the obstacle to generate the sonar detection, and a radially oriented collection of small individual cylinders oriented horizontally3 to create directional information.

4.2.3 Software Modifications

The plots created in Sections 4.1.2, 4.1.3, and 4.1.4 were used to generate the Java 3D™ detection zones4 that the robot must be in for the sonar sensors to work. The detection zone tied to the sensor was modified slightly to more closely match the zone in Figure 4.1. The detection zones were added to the obstacle objects in the Simbad simulation environment. The detection properties determined experimentally were incorporated into the obstacles included in the Simbad simulation, and a modified detection algorithm was created to take advantage of this new information.

Modifications to the software were made in the update-sensor methods in the simulation loop of the program code. According to documentation on the Simbad [15] website on SourceForge, each simulation step in the simulator performs the following operations for each agent (robot) alive in the simulated world:

1. check geometrical collision against other objects.
2. update the sensors (if any) according to current position and sensor rate.
3. update the actuators (if any) according to actuator rate.
4. call user provided method: performBehavior.
5. update position according to kinematic parameters (translation and rotation velocities by default).

As with real devices, sensors and actuators may not be updated on each frame, depending on their update rate.

1 Source code is in simbad.sim.RangeSensorBelt.java
2 Simbad is written using Java 3D so the concepts of vertical and horizontal are somewhat arbitrary, but vertical in this case means perpendicular to the plane the robots typically move in.
3 Simbad is written using Java 3D so the concepts of vertical and horizontal are somewhat arbitrary, but horizontal in this case means parallel to the plane the robots typically move in.
4 These zones were implemented as Shape3D objects, see Appendix 7.3 for an example.

4.2.4 Determining Detection

In the experiments, detection occurred when the robot moved to a position where its detection zone (see Figure 4.1) overlapped the obstacle at the center of the threshold detection plot while the robot was itself inside the threshold detection zone (see Figure 4.3). Detection occurs when these happen simultaneously (see Figure 4.8) and fails when one or more of these conditions are not true, see Figure 4.9.

Figure 4.8: Example of Overlap

Figure 4.9: Example of No Overlap

The obstacle's detection zone was approximated with a shaped area and the sensor detection area was approximated with a cone. These cones can be seen in Figure 4.13 circling the robot on the right, and the shaped area can be seen around the obstacle on the left.
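The two-condition test described above can be sketched as follows. This is an illustrative Java fragment, not the actual modified Simbad code: the 27.5 degree cone half-angle, the 72-entry threshold table (one radius per 5 degree step, matching the experimental sampling), and all class and method names are assumptions introduced for the example.

```java
// Sketch of the two-condition detection test: the obstacle must lie inside
// the sensor's cone, AND the sensor must lie inside the obstacle's
// empirically measured threshold zone. Names and the half-angle are
// hypothetical.
public class DetectionTest {
    static final double HALF_ANGLE_DEG = 27.5; // assumed sensor cone half-angle

    // thresholds[i] is the detection range (m) at bearing i*5 degrees
    static boolean detects(double[] thresholds,
                           double range,           // sensor-to-obstacle distance (m)
                           double obstacleBearing, // obstacle angle in sensor frame (deg)
                           double sensorBearing) { // sensor angle in obstacle frame (deg)
        // Condition 1: the obstacle lies inside the sensor's cone.
        if (Math.abs(normalize180(obstacleBearing)) > HALF_ANGLE_DEG) return false;
        // Condition 2: the sensor lies inside the obstacle's threshold zone.
        int i = (int) Math.round(normalize360(sensorBearing) / 5.0) % 72;
        return range <= thresholds[i];
    }

    static double normalize180(double deg) { // map to (-180, 180]
        double d = deg % 360.0;
        if (d > 180.0) d -= 360.0;
        if (d <= -180.0) d += 360.0;
        return d;
    }

    static double normalize360(double deg) { // map to [0, 360)
        double d = deg % 360.0;
        return d < 0 ? d + 360.0 : d;
    }
}
```

For a round obstacle the table would hold a constant radius of about two meters, reproducing the circular plot of Figure 4.5; for the square and angle obstacles the 72 radii carry the lobes and indentations of Figures 4.3 and 4.7.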
4.2.5 SimBad Organization

A basic outline of how Simbad is organized can be seen in Figure 4.10. The three major categories are SimpleAgent, Device and StaticObject. Agent is the base class for descendants that model robots in the simulation. The Device base class represents both sensors and actuators in the simulation. The StaticObject class has descendants that model immobile obstacles and constant things like the walls that bound the simulation plane.

Figure 4.10: Basic Simbad Layout

Six new types of obstacles, see Figure 4.11, were derived from BlockWorldObject: a SquareObstacle, a RoundObstacle, and an AngleObstacle, along with their matching SquareShadow, RoundShadow, and AngleShadow. These represent the three types of obstacles and their influence zones that the tests were carried out with.

Figure 4.11: Derived from BlockWorldObject

Six objects, see Figure 4.12, were derived from Shape3D, three to display the actual targets and three to represent the influence zones in the simulation.

Figure 4.12: Derived from Shape3D

The new view of the World window in Simbad looks like these examples taken from the tests for the three new obstacles. These images show how the new sensors interact with the new influence zones. In Figure 4.13 the robot with its new sensors can be seen approaching a square obstacle.

Figure 4.13: Square Version

In Figure 4.14 the robot can be seen approaching a round obstacle.

Figure 4.14: Round Version

In Figure 4.15 the robot can be seen with an angle obstacle.

Figure 4.15: Angle Version

Three robots were created, one for each scenario, as per the instructions included in the Simbad documentation.
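Each robot's per-step control can be sketched in outline as a bang-bang edge follower: creep forward, turn one way while the obstacle is detected and the other way when it is lost, so the robot rides the boundary of the detection zone. This is a hedged illustration only; the class, constants (other than the 0.01 m/s test speed given in Section 4.3), and turn rate are hypothetical, not the thesis's actual robot code.

```java
// Sketch of a simple edge-following step. The controller rides the edge of
// the detection zone by alternating its turn direction on the detect/lose
// transitions. TURN_RADPS is an assumed value for illustration.
public class EdgeFollower {
    static final double FORWARD_MPS = 0.01; // speed used in the tests
    static final double TURN_RADPS  = 0.1;  // assumed turn rate

    double translational; // commanded forward velocity (m/s)
    double rotational;    // commanded turn rate (rad/s)

    // Called once per simulation step with the current sonar state.
    void step(boolean obstacleDetected) {
        translational = FORWARD_MPS;
        // Detected: turn away from the obstacle; lost: turn back toward it.
        rotational = obstacleDetected ? TURN_RADPS : -TURN_RADPS;
    }
}
```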
Each robot employed the same simple edge-following algorithm used with the original sensors to circle the obstacles at maximum range.

4.3 Testing the Modified Software

The testing for improvement was done as follows. The simulated robot was directed to circle the obstacle at the limit of detection at a speed of 0.01 meters per second, and its position was recorded at 0.05 second intervals. This was done twice, once with the robot using the original model and then again with the improved model. The sample data in Figure 4.16 show how the position data were logged to a file.

Jan 06, 2013 7:19:54 PM examples.AngleRobot$MyRobot performBehavior
INFO: 0.0,-2.5
Jan 06, 2013 7:19:54 PM examples.AngleRobot$MyRobot performBehavior
INFO: 0.025000000372529037,-2.5
Jan 06, 2013 7:19:54 PM examples.AngleRobot$MyRobot performBehavior
INFO: 0.04999929762332253,-2.4998125017522197
Jan 06, 2013 7:19:54 PM examples.AngleRobot$MyRobot performBehavior
INFO: 0.07499648554845981,-2.4994375158033866

Figure 4.16: Sample Log Data

These data were extracted from the log files and used to generate trace pictures showing the path taken by the robot during the simulation run. Each trace is a track of the robot's path as it circled the obstacles. The plots are presented in Figure 4.17, Figure 4.18, and Figure 4.19 to show how poorly the simulated sensors performed before the upgrade to the new model. As can be seen from the plots, the original sensors simply produced a circular path around the obstacle, ignoring the differences in the actual detection range of a real sonar sensor.

Figure 4.17: Square Original Trace.

Figure 4.18: Round Original Trace.

Figure 4.19: Angle Original Trace.
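Extracting the coordinate pairs from logs in the Figure 4.16 format can be sketched as follows. Only the "INFO:" lines carry coordinates; the java.util.logging header lines are skipped. This is an illustration of the extraction step, not the original tooling, and the class name is hypothetical.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of pulling (x, y) positions out of log lines like those in
// Figure 4.16, for plotting the robot's trace.
public class TraceExtractor {
    static List<double[]> extract(List<String> logLines) {
        List<double[]> points = new ArrayList<>();
        for (String line : logLines) {
            if (!line.startsWith("INFO:")) continue; // skip logger headers
            String[] xy = line.substring(5).trim().split(",");
            points.add(new double[] {
                Double.parseDouble(xy[0]),
                Double.parseDouble(xy[1])
            });
        }
        return points;
    }
}
```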
The improved version was used to produce plots for the same three obstacles using the same robot control code that produced the previous three plots. The difference can be clearly seen in Figure 4.20, Figure 4.21, and Figure 4.22. The tracks much more closely resemble the outline of the actual threshold plots produced when tested against the same obstacles.

Figure 4.20: Square Improved Trace.

The round trace produced the least dramatic change, but this was anticipated as the detection range of the real sensor on a round obstacle was constant.

Figure 4.21: Round Improved Trace.

Appendix A

A.1 Sonar.c

#include <stdlib.h>

unsigned int ticks = 0;
unsigned char temp;

void low_isr(void);
void high_isr(void);

/*
 * For PIC18 devices the low interrupt vector is found at
 * 00000018h. The following code will branch to the
 * low_interrupt_service_routine function to handle
 * interrupts that occur at the low vector.
 */
#pragma code low_vector=0x18
void interrupt_at_low_vector(void)
{
    _asm GOTO low_isr _endasm
}
#pragma code /* return to the default code section */

#pragma interruptlow low_isr
void low_isr(void)
{
    if( INTCONbits.RBIF == 1 ) // Interrupt raised by change in PORTB<7:4>
    {
        // Sonar echo just went high so start the clock
        if( PORTBbits.RB4 == 1 )
        {
            WriteTimer0( 0 );
        }
        // Sonar echo just went low so read the clock
        if( PORTBbits.RB4 == 0 )
        {
            ticks = ReadTimer0();
        }
        temp = PORTB;        // Clear the mismatch on PORTB
        INTCONbits.RBIF = 0; // Reset the interrupt flag
    }
    // Check for any other low level interrupts
}

/*
 * For PIC18 devices the high interrupt vector is found at
 * 00000008h. The following code will branch to the
 * high_interrupt_service_routine function to handle
 * interrupts that occur at the high vector.
 */
#pragma code high_vector=0x08
void interrupt_at_high_vector(void)
{
    _asm GOTO high_isr _endasm
}
#pragma code /* return to the default code section */

#pragma interrupt high_isr
void high_isr(void)
{
    // Interrupt service routine supplied by the UART module. This needs
    // to be called from the ISR of the main program.
    UARTIntISR();
}

#define STRING_SIZE 8

void main( void )
{
    int distance = 0;
    char ascii[STRING_SIZE+1];
    char i;

    for(i=0; i<STRING_SIZE; i++)
        ascii[i] = ' ';

    temp = PORTD;
    TRISD = 0b00000000;
    PORTD = 0;

    ADCON1 = 0x07;
    TRISB = 0b11110000;
    OpenTimer0(TIMER_INT_OFF & T0_16BIT & T0_SOURCE_INT & T0_PS_1_1);

    UARTIntInit();
    /* These macros must come AFTER the UARTIntInit() function call */
    mDisableUARTTxInt();      // Disable the transmit interrupt
    mDisableUARTRxInt();      // Disable the receive interrupt
    mSetUARTTxIntHighPrior(); // Set transmit interrupt priority to high
    mSetUART_BRGHHigh();      // Set the BRGH bit
    mSetUART_SPBRG(12);       // Set the SPBRG register value

    /*
     * In general, each interrupt source has three bits to control its
     * operation: a flag bit to indicate that an interrupt event occurred,
     * an enable bit that allows program execution to branch to the
     * interrupt vector address when the flag bit is set, and a priority
     * bit to select high priority or low priority.
     */
    RCONbits.IPEN = 1;    /* Enable priority levels on interrupts */
    INTCON2bits.RBIP = 0; /* RB port change interrupt: low priority */
    INTCONbits.RBIE = 1;  /* Enable the RB port change interrupt */
    INTCON2bits.RBPU = 1; /* Disable the PORTB pull-ups */
    INTCONbits.GIEL = 1;  /* Enable all low-priority interrupts */
    INTCONbits.GIEH = 1;  /* Enable all high-priority interrupts */

    while(1)
    {
        /*
         * From the SRF04 manual we need a 10 us pulse minimum, so send
         * a 20 clock cycle pulse, about 20 us.
         */
        PORTBbits.RB3 = 1; // Trigger the sonar
        Delay10TCY();
        Delay10TCY();
        PORTBbits.RB3 = 0;

        /*
         * From the SRF04 manual we need an 8 cycle burst at 40 kHz, which
         * takes 0.2 ms; give it 1 ms to make sure it is done, then up to
         * 36 ms for the echo plus 10 ms makes 1+36+10 = 47 ms minimum time
         * between trigger pulses. Run at twice that for good performance.
         */
        PauseMilliSeconds(100);

        /*
         * We are using the 1.0 MHz internal clock to drive Timer0 with a
         * 1:1 prescaler so we are at a 1.0 MHz timer frequency. The speed
         * of sound at STP is 331 m/s or 0.0331 cm/us. We end up with
         * 0.0331 cm/tick or a factor of 30.2 tick/cm. The sound has to
         * fly both ways so we get 60.4.
         */
        //distance = ticks/60;  // distance in cm
        distance = ticks/118;   // distance in 2 cm increments
        PORTD = distance;

        itoa( distance, ascii );
        for(i=0; i<STRING_SIZE; i++)
        {
            UARTIntPutChar(ascii[i]);
            ascii[i] = ' ';
        }
        UARTIntPutChar(10);
        UARTIntPutChar(13);
        PauseMilliSeconds(100);
    }
}

A.2 AngleTargetShadow.java

package simbad.sim;

import javax.media.j3d.Appearance;
import javax.media.j3d.ColoringAttributes;
import javax.media.j3d.Geometry;
import javax.media.j3d.GeometryArray;
import javax.media.j3d.PolygonAttributes;
import javax.media.j3d.TransparencyAttributes;
import javax.media.j3d.TriangleArray;
import javax.vecmath.Point3f;
import javax.vecmath.Vector3d;

/**
 * @author Allan Kranz
 */
public class AngleTargetShadow extends javax.media.j3d.Shape3D {

    private Geometry geometry;
    private Appearance appearance;

    /**
     * Dimension of the "detection shadow"
     */
    Vector3d myExtent = null;

    public AngleTargetShadow() {
        myExtent = new Vector3d(0.0, 0.0, 0.0);
        geometry = createGeometry();
        appearance = createAppearance();
        this.setGeometry(geometry);
        this.setAppearance(appearance);
    }

    /**
     * @param extent
     * @param appearance
     */
    public AngleTargetShadow(Vector3d extent, Appearance appearance) {
        this.myExtent = extent;
        this.appearance = appearance;
        geometry = createGeometry();
        this.setGeometry(geometry);
        this.setAppearance(appearance);
    }

    /*
     * Creates a geometry for the angle target "shadow of detection"
     *
     * @return The newly created geometry.
* */ private Geometry createGeometryO { Point3f[] myCoordinates ■ { // Bottom new Point3f(0 OOOf, 0.OOOf, O.OOOf) new Point3f(0. OOOf, O.OOOf, 3.160f) new Point3f(0. 251f, 0.OOOf, 2.869f) new Point3f(0. OOOf, O.OOOf, O.OOOf) new Point3f(0. 251f, O.OOOf, 2.869f) new Point3f(0. 160f, O.OOOf, 0.906f) new Point3f(0. OOOf, O.OOOf, O.OOOf) new Point3f(0. 160f, O.OOOf, 0.906f) new Point3f(0. 223f, O.OOOf, 0.831f) new Point3f (O.OOOf new Point3f (0.223f new Point3f (0.123f new Point3f (O.OOOf new Point3f (0.123f new Point3f (0.093f new Point3f (O.OOOf new Point3f (0.093f new Point3f (0.080f new Point3f (O.OOOf new Point3f (o.oeof new Point3f (0.04€f new Point3f (O.OOOf new Point3f (0.046f new Point3f (O.OSlf new Point3f (O.OOOf new Point3f (0.051f new Point3f (0.057f new Point3f (O.OOOf new PointSf (0.0S7f new PointSf (0.092f new Point3f (O.OOOf new Point3f (0.092f new Point3f (0.098f new Point3f (O.OOOf new Point3f (0.098f new PointSf (0.260f new Point3f (O.OOOf new Point3f (0.260f new PointSf (0.344f new Point3f (O.OOOf new Point3f (0.344f new Point3f (0.432f new Point3f (O.OOOf new PointSf (0.432f new PointSf (0.773f new PointSf (O.OOOf new Point3f (0.773f new PointSf (0.886f new Point3f (O.OOOf new Point3f (0.886f new Point3f (2.630f new PointSf (O.OOOf new Point3f (2.630f new Polnt3f (2.700f new Point3f (O.OOOf new Point3f (2.700f new Point3f (2.670f new Point3f (O.OOOf new Point3f (2.670f new Point3f (0.492f new Point3f (O.OOOf new Point3f (0.4921 new PointSf (0.348f new Point3f (O.OOOf new Point3f (0.348f new Point3f (0.226f new Point3f (O.OOOf new Point3f (0.226f new Point3f (0.199f new Point3f (O.OOOf new Point3f (0.199f new Point3f (0.lS6f new Point3f (O.OOOf new Point3f (0.156f new Point3f (0.1311 new Point3f (O.OOOf new Point3f (0.1311 new Point3f (0.1231 new Point3f (O.OOOf new Point3f (0.1231 new Point3f (0.099f new Point3f (O.OOOf new Point3f (0.099f new Point3f (0.0391 new Point3f (O.OOOf new Point3f (0.0391 new PointSf (0.0691 63 O.OOOf, O.OOOf), O.OOOf, 
        // ... (the remaining entries of myCoordinates continue the triangle
        // fans around the origin: the bottom face at y = 0.000f, the side
        // walls joining y = 0.000f and y = 1.000f, and the top face at
        // y = 1.000f, closing at the final outline vertex below) ...
        new Point3f(0.000f, 1.000f, 3.160f)
    };
    TriangleArray myTriangles = new
TriangleArray(myCoordinates.length, GeometryArray.COORDINATES);
        myTriangles.setCoordinates(0, myCoordinates);
        return myTriangles;
    }

    /**
     * @return Returns an appearance.
     */
    private Appearance createAppearance() {
        Appearance appearance = new Appearance();
        TransparencyAttributes transparencyAttributes = new TransparencyAttributes();
        transparencyAttributes.setTransparencyMode(TransparencyAttributes.BLENDED);
        transparencyAttributes.setTransparency(0.60f);
        appearance.setTransparencyAttributes(transparencyAttributes);
        ColoringAttributes coloringAttributes = new ColoringAttributes(
                Color3fConstant.BURLYWOOD, 1);
        appearance.setColoringAttributes(coloringAttributes);
        PolygonAttributes polyAppear = new PolygonAttributes();
        polyAppear.setCullFace(PolygonAttributes.CULL_NONE);
        appearance.setPolygonAttributes(polyAppear);
        return appearance;
    }
}
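The vertex list in the listing above follows a simple pattern: the measured shadow outline is triangulated as a fan anchored at the origin, so each consecutive pair of outline points forms one triangle with (0, y, 0). The following is a minimal, self-contained sketch of that construction; it is not Simbad's or the appendix's actual code, and the `ShadowFan` class and its local `Point` stand-in for `javax.vecmath.Point3f` are illustrative assumptions so the sketch runs without Java 3D.

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Sketch (assumed names, not the thesis code) of the triangle-fan layout
 * used by the shadow polygon: every outline edge (p[i], p[i+1]) becomes
 * the triangle (origin, p[i], p[i+1]) at a fixed height y.
 */
public class ShadowFan {
    /** Local stand-in for javax.vecmath.Point3f. */
    public static final class Point {
        public final float x, y, z;
        public Point(float x, float y, float z) { this.x = x; this.y = y; this.z = z; }
    }

    /** Triangulate an (x, z) outline into a fan around (0, y, 0), one triangle per edge. */
    public static List<Point> fan(float y, float[][] outline) {
        List<Point> tris = new ArrayList<>();
        for (int i = 0; i + 1 < outline.length; i++) {
            tris.add(new Point(0f, y, 0f));                               // fan centre
            tris.add(new Point(outline[i][0], y, outline[i][1]));         // edge start
            tris.add(new Point(outline[i + 1][0], y, outline[i + 1][1])); // edge end
        }
        return tris;
    }

    public static void main(String[] args) {
        // Three outline points give two edges, hence two triangles (six vertices),
        // matching the per-edge triplets visible in the appendix listing.
        float[][] outline = { {0.000f, 3.160f}, {0.251f, 2.869f}, {0.160f, 0.906f} };
        System.out.println(fan(0f, outline).size());
    }
}
```

The resulting flat `List<Point>` corresponds to the `myCoordinates` array handed to `TriangleArray` with the `COORDINATES` vertex format, three entries per triangle.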