The IFU Planning Table 4.0 is an innovative tool that makes current research results on tomorrow's factory planning tangible. A special added value of this unique tool is the possibility of participatory planning, in which all relevant knowledge holders within the company are involved in the planning process and their planning results are combined with the relevant internal and external restrictions of the company. The special feature: thanks to the intuitive operation and the role-specific presentation of information, no specific factory planning knowledge is required for this interaction. The result is a valid layout achieved with high planning speed and quality. The focus is on integrating the implicit, undocumented knowledge of all relevant decision-makers into the planning process.
Intuitive operation of the table is achieved with additively manufactured, true-to-scale 3D models, each of which is recognized via a round marker on the planning table. The 3D models serve as true-to-scale figures of the machines and systems and are mounted on the markers. Using the markers, the planner arranges the objects by rotating and moving them and thus plans the factory step by step. Via the menu buttons, the planner is provided with object-specific knowledge from a central cloud, which is linked to the objects via the marker ID. The markers are identified by triangulation on the highly precise touch surface of the planning table.
The role-based database management system consists of a cloud database that stores and updates the technical regulations for workstations as well as machine-specific data. The marker IDs link this database to the 3D objects. If the planning contravenes these specifications, the system assigns this information to the responsible employee in the company and actively makes it available to them. The planning is thus continuously monitored for validity, and the responsible employee receives a structured overview of the open design instructions.
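The rule-checking idea described above can be sketched in a few lines. This is a minimal illustration, not the actual system: the rule, object data and employee names are invented, and the real cloud database naturally holds far richer regulations.

```python
# Hypothetical sketch of the role-based validity check: a planning rule
# is evaluated and, on violation, routed to the responsible employee.
RULES = {
    "min_aisle_width_m": 1.25,  # illustrative workstation regulation
}

def check_aisle(width_m, responsible):
    """Validate a planned aisle width; return a notification for the
    responsible employee if the rule is contravened, else None."""
    minimum = RULES["min_aisle_width_m"]
    if width_m < minimum:
        return {
            "to": responsible,
            "issue": f"aisle width {width_m} m is below the minimum {minimum} m",
        }
    return None

notice = check_aisle(1.0, "planner_a")   # violation -> notification
ok = check_aisle(1.5, "planner_a")       # compliant -> no notification
```

In the real system such checks would run continuously against every object placed on the table, producing the structured overview of open design instructions mentioned above.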
Motion capture is a tracking procedure with which movements are captured and then analyzed and processed with the aid of computers. The system consists of multiple infrared cameras that capture so-called targets, which are attached to the different limbs of a test subject. The captured movement can then be transferred in real time to a mannequin in a simulation and analyzed ergonomically.
Motion capture was originally developed for the film industry to create very natural motion sequences for animated figures. For example, this technology was used successfully for the character Gollum in the Lord of the Rings trilogy. The use of motion capture within industry is also increasing.
The AgeMan aging simulation suit lets wearers experience physical limitations first-hand: the field of vision, hearing and motor skills are restricted while the suit is worn. The AgeMan thus demonstrates the need for age-appropriate work design in a practice-oriented way. In addition, the suit is used to sensitize younger employees to age-related work restrictions. For Work 4.0, the examination of age and aging is of great importance, since work content will become more demanding and complex in the future. With the application of the AgeMan, potentials for Work 4.0 can be identified at an early stage and the implementation time of age-appropriate measures may even be shortened.
Exoskeletons, also referred to as outer skeletons or human support robots, are support structures worn on the body that reduce physical strain by means of (electro-)mechanical support and lower the risk of injury. IFU works with the Chairless Chair, a passive exoskeleton. This is a mobile chair unit that is attached to the employee's body and supports them during changes of posture with mechanical spring-balancer or cable-pull systems. With the Chairless Chair, the employee can choose between a standing and a seated position during the working process. In addition, the exoskeleton is designed in such a way that it does not place an additional burden on the employee even when worn for several hours.
In light of the fact that musculoskeletal disorders are the primary cause of lost working time, the exoskeleton offers the following advantages: it relieves the employee's spine and reduces one-sided strain. This can in turn reduce illness-related downtime. Because the technology is portable – the employee does not need to take off the exoskeleton and can wear it at all times – a more flexible work design is possible, so that time savings in carrying out work activities can be expected. IFU thus offers practical application possibilities for designing existing work systems in an ergonomic and future-oriented way.
With a Force Feedback Device, ergonomics, accessibility and assembly simulations can be performed in various simulation environments in the early planning phase. The Force Feedback Device provides haptic feedback in all six spatial degrees of freedom, raising the user's immersion to a new level. Thanks to the large workspace, which is modelled on that of a human arm, and the high forces that can be rendered, ergonomic loads can be experienced by the planner already in this early phase. The added value of this technology is that processes can be digitally validated at an early stage of factory planning. One example is screw connections, which can be depicted almost realistically in virtual reality thanks to the haptic feedback.
One possibility for creating age-appropriate workstations is the use of robots in assembly. Among other things, this improves the ergonomics of work processes, which unburdens employees and counteracts the consequences of demographic change. For safety reasons, however, the strict separation of manual and automatic workstations is still a major obstacle. The solution to this is Human-Robot Collaboration (HRC). This direct cooperation between human and robot lays the foundation for innovative, future-proof production processes (Industry 4.0). In this way, humans and robots can form a collaborative working system. For further research in the area of HRC, a KUKA LBR iiwa is located in the Center of Excellence (see photo).
Thanks to their high sensitivity, the new generation of lightweight robots (such as the KUKA LBR iiwa) enables flexible support in various assembly activities. For adhesive applications, for example, the so-called impedance mode can be used to guide the nozzle needle along the component geometry under force control. The laborious teaching and re-teaching of a multitude of individual points for path planning is no longer necessary. In this way, the quality of the adhesive seam and the process reliability of the adhesive application are significantly improved.
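The force-controlled guidance described above can be illustrated with a deliberately simplified sketch: a proportional controller adjusts the tool height so that the measured contact force tracks a setpoint, which is the core idea behind letting the nozzle follow the component geometry instead of teaching individual points. The numbers and the one-axis model are assumptions for illustration; the robot's actual impedance control is far more sophisticated.

```python
# Minimal one-axis sketch (illustrative gains and values): keep the
# contact force at a target by nudging the tool height each cycle.
def force_step(z, measured_force, target_force=2.0, gain=0.0005):
    """Return the new tool height. Lowering the tool (smaller z above
    the surface) increases contact force, so we subtract gain * error."""
    error = target_force - measured_force
    return z - gain * error  # press down if force too low, lift if too high

z_low = force_step(0.10, measured_force=1.0)   # too little contact -> lower
z_high = force_step(0.10, measured_force=3.0)  # too much contact -> lift
```

Run in a loop with the real force sensor reading, this kind of control lets the tool ride along an unknown surface contour, which is exactly what makes point-by-point teaching unnecessary.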
To simplify the programming and maintainability of a human-robot collaboration, universally applicable program modules were developed that enable simple yet reliable programming. For the definition of the robot movements, the five well-known basic motion modules from the MTM (Methods-Time Measurement) basic cycle are used, which are supplemented with additional process-related modules depending on the mounted tool. With this modular system, complete HRC applications can be realized easily and intuitively using drag-and-drop.
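The modular composition idea can be sketched as follows. The module and parameter names here are illustrative, not the actual IFU building blocks; the sketch only shows how a pick-and-place cycle is assembled from the five MTM basic motions (reach, grasp, move, position, release) plus optional tool-specific process modules.

```python
# Hypothetical sketch: composing an HRC task from MTM basic motions,
# optionally extended with process modules for the mounted tool.
def build_sequence(target, place, tool_steps=()):
    """Return an ordered list of (module, parameter) pairs for one
    pick-and-place cycle; tool_steps adds tool-specific modules."""
    seq = [
        ("reach", target),     # move the empty hand/gripper to the part
        ("grasp", target),     # close the gripper on the part
        ("move", place),       # transport the part to the destination
        ("position", place),   # align the part precisely
    ]
    seq += [("tool", step) for step in tool_steps]  # e.g. apply adhesive
    seq.append(("release", place))                  # open the gripper
    return seq

cycle = build_sequence("part_7", "fixture_2", tool_steps=["apply_glue"])
```

A drag-and-drop front end would essentially let the user reorder and parameterize such module lists, which is what makes complete HRC applications realizable without textual programming.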
In 2017, the Franka Emika lightweight robot was awarded the German Future Prize. It is modular in construction and consists of ultra-light components. It has sensitive sensors in all joints that, in combination with a unique control concept, allow for human-like flexibility and sensitivity. As a result, the robot is able to react extremely quickly to light contact. This prevents people from being injured by accidental collisions with the robot. Operating the system is intuitive to learn and requires no programming knowledge, making the use of the robot as easy as handling a smartphone. This opens up a wide range of new application perspectives from which small and medium-sized businesses can also benefit. Indeed, the application possibilities extend far beyond industrial production. In the Center of Excellence, for example, the robot is used for training young people in automation technology.
The deployed RFID scanners capture every small load carrier and every product within the work area. Passive RFID tags that can be read wirelessly are affixed to the respective products or load carriers. This information can be used to directly determine and read out pre-processes, process durations, post-processes and related background information. RFID technology offers significant advantages over other methods for the identification and localization of objects. Barcodes, for example, can only be read. For RFID, this applies only to the so-called read-only transponders. In addition, however, there are transponders that can be written once or multiple times. Multi-writable transponders are more expensive than write-once tags, but they offer more possibilities for use. Some tags also allow for the storage of larger amounts of data.
The RFID technology used in the Center of Excellence for Lean Enterprise 4.0 operates in the frequency range 860 MHz – 960 MHz (UHF) using passive transponders. Depending on the configured output power of the RFID reader, the tags can be read and written over a distance of up to 1.5 m. The implementation of RFID technology makes it possible to uniquely identify each component and locate it at a specific workstation. In addition to the real-time visualization of key figures, this ensures uninterrupted traceability of every component.
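The three transponder classes mentioned above (read-only, write-once, multi-writable) can be modelled in a short sketch. The class and tag IDs are invented for illustration; real UHF tags follow the EPC Gen2 memory model rather than this simplified interface.

```python
# Illustrative model of the transponder classes: read-only,
# write-once (one successful write allowed) and multi-writable.
class Tag:
    def __init__(self, tag_id, writable=False, multi=False):
        self.tag_id = tag_id
        self.writable = writable   # False -> read-only transponder
        self.multi = multi         # True -> can be rewritten repeatedly
        self._written = False
        self.data = None

    def write(self, payload):
        if not self.writable:
            raise PermissionError("read-only tag")
        if self._written and not self.multi:
            raise PermissionError("write-once tag already written")
        self.data = payload
        self._written = True

worm = Tag("E200-01", writable=True)             # write-once tag
worm.write({"part": "housing", "station": "assembly_3"})
multi = Tag("E200-02", writable=True, multi=True)  # multi-writable tag
multi.write("station_1")
multi.write("station_2")                          # rewriting is allowed
```

This mirrors the trade-off in the text: the multi-writable tag can carry updated process data from station to station, while the cheaper write-once tag is fixed after its first write.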
The Real-Time Location System (RTLS) locates individual active tags in the three-dimensional space of the Center of Excellence with the aid of permanently installed sensors. The tags transmit an ultra-wideband signal at a specified frequency. In the associated software, they are coupled to virtual CAD objects, from which a digital shadow of the real production environment is created. The positions of workstations, resources and products can therefore be tracked and determined at any time. The aggregated view of historical data also enables the visualization of walking paths and routes, thus revealing waste and potential for improvement. Moreover, search times can be reduced.
Ultra-wideband (UWB) sensors located in the corners of the room are utilized to locate the RTLS tags within the space.
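The geometric principle behind locating a tag from fixed anchors can be illustrated with a 2D trilateration sketch. The anchor coordinates and tag position are made up for the example; a real RTLS locates in three dimensions, fuses more than three anchors, and typically measures signal time-of-flight rather than being handed exact distances.

```python
import math

# Sketch: estimate a tag position from its distances to three fixed
# UWB anchors by solving the linearized 2D trilateration equations.
def trilaterate(a1, a2, a3, r1, r2, r3):
    """Return (x, y) from anchor positions a1..a3 and ranges r1..r3."""
    (x1, y1), (x2, y2), (x3, y3) = a1, a2, a3
    # Subtracting pairs of circle equations yields two linear equations:
    #   A*x + B*y = C  and  D*x + E*y = F
    A = 2 * (x2 - x1); B = 2 * (y2 - y1)
    C = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    D = 2 * (x3 - x2); E = 2 * (y3 - y2)
    F = r2**2 - r3**2 - x2**2 + x3**2 - y2**2 + y3**2
    x = (C * E - F * B) / (E * A - B * D)
    y = (C * D - A * F) / (B * D - A * E)
    return x, y

anchors = ((0.0, 0.0), (10.0, 0.0), (0.0, 8.0))   # illustrative corners
true_tag = (3.0, 4.0)
ranges = [math.dist(a, true_tag) for a in anchors]
estimate = trilaterate(*anchors, *ranges)
```

With noisy real-world ranges the system would solve this as a least-squares problem over all visible anchors, but the linearization above is the core of how corner-mounted sensors pin down a tag's position.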
The Microsoft HoloLens is a mixed reality and augmented reality device that allows the user to view 3D projections in their direct environment. It combines a 3D head-mounted display, sensors, loudspeakers, and an independent central processing unit (CPU) as well as a graphics processing unit (GPU). This gives the user the impression that virtual objects are present in their real environment. The user can also interact with these objects through speech and gesture control; for example, the objects can be moved or resized.
In the Center of Excellence, this technology is used for a Lean Service 4.0 application scenario. Here, a service technician is made aware of the maintenance requirements of the machines in the Center of Excellence via the IFU Lean Service 4.0 application for the Microsoft HoloLens. For the different maintenance cases, all necessary information is displayed in the user's field of view using images, videos and geopoints. The service technician can thus work with both hands while always keeping all of the important information in view.
Using 3D printing technology, real, functional prototypes as well as spare parts can be produced from CAD data in a short amount of time. These prototypes can be used in the early planning phase to test fit and function. Owing to their high stability, simple components produced this way can even be used directly in series production. The Center of Excellence uses a 3D printer from Stratasys that processes meltable plastics to produce the respective prototypes or spare parts.