3D Printed Tactile Learning Objects: Proof of Concept

By Michael A. Kolitsky, Ph.D.

Dr. Kolitsky is an online adjunct professor in the Department of Biological Sciences at the University of Texas at El Paso. He has been retired for about 10 years but has been teaching online since 2000.

Abstract

Image-based learning objects produced today are primarily designed for the virtual world and are not easily accessible to blind or visually impaired students. The opportunity to 3D print images from disciplines such as anatomy, histology, cell biology, astronomy, and geology has opened new vistas for the construction of tactile learning objects made for the real world rather than the digital world. This paper provides a “proof of concept” for the idea of more widespread production of tactile learning objects mirroring the design of virtual learning objects, especially in STEM (science, technology, engineering, and math) disciplines where there is heavy use of images from microscopes, telescopes, and satellites. Evidence promoting the production of 3D printed braille is also provided. It is further suggested that many virtual learning objects may need to be redesigned using 3D printing so they may be made touchable for a more complete tactile learning experience.

Keywords

3D printing, tactile learning objects, blind or visually impaired students, STEM, proof of concept

 

Introduction

3D printing is a common method used in engineering disciplines to make a “proof of concept” model for testing the efficacy of a new mechanical device before building it in its final form. The work described in this paper takes the same proof-of-concept approach to explore the use of 3D printing in the construction of image-based tactile learning objects.

Learning objects are a central component of many virtual learning experiences, and websites such as Merlot [1] and the UWM Learning Objects Repository [2] have grown to become trusted repositories of faculty-reviewed learning objects, freely downloadable for other faculty to use with confidence in the design of their students’ learning experiences. But most of the learning objects in Merlot, the UWM Learning Objects Repository, and other repositories are in digital format, and although reusable, they do not easily translate into a learning experience for the blind or visually impaired student. This is especially true for learning objects that have an image-based structural component at the heart of the learning experience, as is the case in many STEM (science, technology, engineering, and math) disciplines. It is challenging for a blind or visually impaired student to study learning objects for anatomy or for a microscope-based lab if the experience is only virtual. 3D printing, however, offers the opportunity to turn the image-based structural component of a virtual or digital learning object into a real object that can be touched for a tactile learning experience (Jaquiss, 2012). How this can be accomplished, with examples of 3D prints made from 2D images, will be the focus of this paper.

3D Printing Hardware/Software

Two avenues were followed for making 3D prints of images from anatomy, histology, cell biology, astronomy, and geology. Some 3D prints were made with the MakerBot Replicator 2 [3] using natural-color 1.75 mm PLA filament, and others were made by sending the stereolithography (STL) files to i.materialize [4] in Belgium via their online ordering site. Turn-around time was approximately three weeks from file submission to i.materialize, with costs around $17.00 per print. The MakerBot Replicator 2 had a two-month wait time from initial order due to the arrival of Hurricane Sandy, but as of April 2013 the shipping time was approximately two weeks.

The type of file used for 3D printing is a stereolithography (STL) file, which describes the surface of the object as a mesh of many small triangles. Slicing software converts that mesh into horizontal slices, and the object is then created by adding one layer of plastic, or one slice, at a time in a process that is sometimes referred to as “additive manufacturing” (Jaquiss, 2012). There are several methods one can use to make an STL file for 3D printing. The most common approach for those skilled in computer-aided design (CAD) is to make a 3D object in the virtual environment and then save that digital object as an STL file. However, there are other ways to make STL files that do not require knowledge of CAD. 123D Catch by AutoDesk [5] is a free downloadable app for the iPhone, iPad, or PC that captures a series of images taken around an object. These images can then be sent to the AutoDesk cloud to be stitched together into a 3D object with its own STL file for download. MakerBot has also announced that a 3D scanner should be available by the end of 2013 with the capability to scan a real object and create an STL file for 3D printing.
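For reference, a single triangular facet in the human-readable (ASCII) form of an STL file looks like the example below; the coordinates are placeholder values rather than measurements from any of the prints described here. A complete model is simply many thousands of such facets, which slicing software such as MakerWare then converts into the layer-by-layer paths the printer follows.

  solid example
    facet normal 0.0 0.0 1.0
      outer loop
        vertex 0.0 0.0 1.5
        vertex 10.0 0.0 1.5
        vertex 0.0 10.0 1.5
      endloop
    endfacet
  endsolid example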

The software used for this project is PhotoToMesh [6], which makes STL files from 2D photographs or JPEG files on a computer. PhotoToMesh assigns lighter colors or white a higher value on the Z-axis than darker colors or black, producing a relief-like 3D STL file. Some adjustment of the colors in a photograph, including conversion to grayscale or even black and white, is sometimes necessary for the proper Z-axis placement of some regions of the photo. This software considerably expands what can be made into a 3D object by those of us who are not skilled in CAD. And even when CAD is not a barrier to making an STL file, there are many subject areas in STEM that cannot readily be modeled in CAD, such as images from light or electron microscopes, human anatomy dissections, telescope views of astronomical objects, and satellite views of the Earth. There is also an art to using the adjustable settings in PhotoToMesh to create an STL file of a certain size and Z-axis depth, or with a degree of smoothing that removes any anomalous peaks in the final mesh. The resulting STL file was then loaded into the Replicator 2 using the free MakerWare software from MakerBot, which permitted further adjustments to the 3D object before the printing process began.
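The brightness-to-height mapping that PhotoToMesh performs can be illustrated with a short, self-contained sketch. The following Python code is not PhotoToMesh and is not the workflow used for the prints in this paper; it is a minimal illustration, assuming the Pillow and NumPy libraries are available, of how a grayscale image can be turned into a relief-style STL surface in which lighter pixels sit higher on the Z-axis. The file names and dimensions are examples only.

# Minimal brightness-to-height sketch (not PhotoToMesh): map each pixel's
# gray value to a Z height and write the resulting relief surface as an
# ASCII STL file.  Assumes Pillow and NumPy; file names are examples only.
import numpy as np
from PIL import Image

def image_to_relief_stl(image_path, stl_path, grid=100,
                        xy_size=80.0, z_depth=5.0, base=1.0):
    """Write a relief-style STL in which lighter pixels print higher."""
    img = Image.open(image_path).convert("L")        # convert to grayscale
    img = img.resize((grid, grid))                   # keep the mesh small
    bright = np.asarray(img, dtype=float) / 255.0    # 0 (black) .. 1 (white)
    z = base + bright * z_depth                      # white = tallest, as in PhotoToMesh
    step = xy_size / (grid - 1)

    facets = []                                      # each facet = three (x, y, z) vertices
    for r in range(grid - 1):
        for c in range(grid - 1):
            x0, x1 = c * step, (c + 1) * step
            y0, y1 = r * step, (r + 1) * step
            a = (x0, y0, z[r, c])
            b = (x1, y0, z[r, c + 1])
            d = (x0, y1, z[r + 1, c])
            e = (x1, y1, z[r + 1, c + 1])
            facets.append((a, b, e))                 # two triangles per grid cell
            facets.append((a, e, d))

    with open(stl_path, "w") as f:
        f.write("solid relief\n")
        for tri in facets:
            # Normals are left as zero vectors; most slicers recompute them.
            f.write("  facet normal 0 0 0\n    outer loop\n")
            for x, y, zv in tri:
                f.write(f"      vertex {x:.3f} {y:.3f} {zv:.3f}\n")
            f.write("    endloop\n  endfacet\n")
        f.write("endsolid relief\n")

# Example call; a watertight, printable solid would also need side walls and
# a flat base, which PhotoToMesh adds automatically.
image_to_relief_stl("muscle_slide.jpg", "muscle_relief.stl")

Because only the top surface is generated here, the sketch shows the brightness-to-height mapping itself rather than a complete, print-ready solid.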

Proof of concept

The first thing that must be demonstrated is that 3D prints can be made from 2D STEM images so that they can then become the central component of a tactile learning object. Some examples of 3D prints made from 2D photos can be found at the end of this report in Figures 1–6. The original images for these 3D prints came from the Anatomy and Physiology Online e-text produced by Primal Pictures [7], the NASA Hubble Telescope Solar System Gallery [8], the National Library of Medicine Visible Human Project [9], and the author’s own 2D image collection.

How do we use the 3D prints to make a tactile learning object?

A learning object is a stand-alone, reusable learning tool that fits together with other learning objects like Lego blocks. It is usually considered to reside in the digital world and to have an assessment linked to it. So how do we incorporate a 3D print into this configuration to create a tactile learning object? The 3D prints generated by PhotoToMesh are rectangular or square pieces of plastic with one side flat and the other side carrying a tactile, embossed-like duplication (similar to bas relief) of the 2D image from which it was made. Work has begun on different ways to inform the blind or visually impaired student about what should be learned and where the important structures are located on a 3D print. Braille is important in the design of a tactile learning object, and it was found that braille symbols can also be 3D printed. The first attempts to produce braille 3D prints with PhotoToMesh yielded raised areas that were rough and required sanding with fine sandpaper. But at the suggestion of a member of the MakerBot team, the braille was 3D printed in a vertical rather than horizontal position on the build platform of the Replicator 2. This proved very successful, producing braille bumps that were quite smooth yet still able to convey the information needed to read the braille text. Future tactile learning objects will use this method to produce instructions, tests, and other information in tactile form. This first step in producing a tactile learning object is shown in Figure 7, which depicts a 3D print of the braille terms used to identify the telophase stage of mitosis in whitefish blastula cells, made from a microscope slide image. Figure 8 shows how three 3D prints can be arranged to form a tactile learning object. The original 3D print (far left) shows what is viewed with a light microscope. The middle 3D print is in line-graphic form to assist in identifying the important structures on the original 3D print, and the far-right 3D print contains the terms to be identified, in braille.
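The braille labels described above were produced with PhotoToMesh and printed vertically on the Replicator 2. As a rough illustration of how braille geometry could also be laid out programmatically, the short Python sketch below computes the (x, y) centers of the raised dots for a text label. It is not the method used in this work: the handful of letter patterns, the spacing values, and the function name are illustrative approximations rather than a braille standard.

# Illustrative sketch only: compute (x, y) dot centers, in millimeters, for a
# short braille label.  The letter patterns cover just a few letters and the
# spacing values are approximate, not taken from a braille specification.
BRAILLE_DOTS = {                 # dots 1-3 form the left column, 4-6 the right
    "a": (1,), "b": (1, 2), "c": (1, 4), "e": (1, 5), "l": (1, 2, 3),
}
DOT_POS = {1: (0, 0), 2: (0, 1), 3: (0, 2),   # (column, row) of each dot
           4: (1, 0), 5: (1, 1), 6: (1, 2)}
DOT_SPACING = 2.5                # mm between dots within a cell (approximate)
CELL_SPACING = 6.2               # mm between adjacent cells (approximate)

def braille_dot_centers(text):
    """Return a list of (x, y) centers for the raised dots of the given text."""
    centers = []
    for i, ch in enumerate(text.lower()):
        for dot in BRAILLE_DOTS.get(ch, ()):
            col, row = DOT_POS[dot]
            x = i * CELL_SPACING + col * DOT_SPACING
            y = -row * DOT_SPACING           # rows run downward on the label
            centers.append((x, y))
    return centers

# Each center could then be extruded as a small dome on a flat base plate.
print(braille_dot_centers("cell"))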

In addition to the use of braille, a LiveScribe pen [10] will also be employed in future tactile learning object constructions to provide the option for audio to be substituted for braille. Text will also be included for teachers or others assisting in the learning process who do not read braille.

Implications for Practitioners and Families

This work demonstrates the feasibility of incorporating 3D prints of image-based materials, which are not currently accessible to blind or visually impaired students, into any course that uses microscope or telescope images as a central component of the learning experience. Images from human anatomy, or from any level of biological study in which structure is examined, can now be 3D printed for a tactile learning experience. Courses in astronomy, which depend heavily on viewing objects in space such as planets and nebulae, and courses in geology, which examine satellite images of Earth’s features, now have a 3D printing route for producing learning objects designed especially for the blind or visually impaired student. Space and earth science museums can also expand their tactile displays for a more interactive experience. This work suggests that repositories of STL files for STEM studies will become available for download and 3D printing for study in schools, universities, or even the home. And since blind or visually impaired students should, as much as possible, have the same access to learning as sighted students, we should begin to see a future in which 3D printing in the K-16 environment and at home plays a greater role in making a wide variety of tactile learning objects.

References

Jaquiss, Robert (2012). Advanced Technology for Producing Tactile Materials. Braille Monitor, 55(4). Retrieved from https://nfb.org/images/nfb/publications/bm/bm12/bm1204/bm120407.htm


Figure 1. 3D prints of whitefish blastula cells in mitosis. Left 3D image is in metaphase while the right 3D image is in anaphase.


Figure 2. Light microscope striated muscle slide. Left image is the original slide image; right image is the 3D print.
Source: Primal Pictures Anatomy and Physiology Online E-text


Figure 3. 3D print of Saturn made from a photo taken by the Hubble telescope. Source: Hubblesite from the Space Telescope Science Institute (http://hubblesite.org/newscenter/archive/releases/2001/15/image/g/)


Figure 4. 3D print (left image) of neuromuscular junction slide (right image). Source: Primal Pictures Anatomy and Physiology Online E-text


Figure 5. Transverse section of elbow of cadaver to show ulnar nerve (funny bone). Left image is a 3D print made from the JPG file on the right.
Source: Visible Human Project at the National Library of Medicine


Figure 6. 3D print (left) made from the LandSat satellite image (right) of the Mississippi Delta. Source: NASA image created by Jesse Allen, using data provided by the University of Maryland’s Global Land Cover Facility. (http://visibleearth.nasa.gov/view.php?id=8103)


Figure 7. 3D printed terms in braille for telophase mitosis stage (text added to displayed graphic in Photoshop).


Figure 8. First attempt to combine three 3D prints into a tactile learning object in which students study a whitefish blastula cell in telophase of mitosis and identify the structures particular to that stage. See Figure 7 for a text translation of the braille terms. The left-most 3D print is the original, with the most tactilely distinct structures to be identified. The middle image has been modified to simulate a line graphic, and the right-most image shows the position of the 3D printed braille text, with the braille terms matching where the lines on the middle graphic end at its right-hand border.


The Journal of Blindness Innovation and Research is copyright (c) 2014 to the National Federation of the Blind.