1. 2017. BMW’s 7-Series ‘gesture controls’ work pretty well. (2017). https://eu.usatoday.com/story/money/cars/2016/05/16/bmws-7-series-gesture-controls-work-pretty-well/32613369/
2. 2017. Explore Golf R: Volkswagen UK. (2017). http://www.volkswagen.co.uk/new/golf-vii-pa/explore/r
3. 2017. Look out for Ultrahaptics haptic feedback in new cars this year | TechCrunch. (2017).
4. 2018. Car functions now controlled by waving a hand. (2018). https://eu.usatoday.com/story/money/cars/2013/01/10/cartech-gestures-ces/1820453/
5. 2018. Ultrahaptics - Discover a new type of haptics. (2018). https://www.ultrahaptics.com/
6. Micah Alpern and Katie Minardo. 2003. Developing a car gesture interface for use as a secondary task. CHI EA (2003), 932. DOI: http://dx.doi.org/10.1145/766077.766078
7. Tom Carter, Sue Ann Seah, Benjamin Long, Bruce Drinkwater, and Sriram Subramanian. 2013. UltraHaptics: Multi-Point Mid-Air Haptic Feedback for Touch Surfaces. UIST (2013), 505–514. DOI: http://dx.doi.org/10.1145/2501988.2502018
8. Patricia Ivette Cornelio Martinez, Silvana De Pirro, Chi Thanh Vi, and Sriram Subramanian. 2017. Agency in Mid-air Interfaces. In CHI. DOI: http://dx.doi.org/10.1145/3025453.3025457
9. Nelson Cowan. 2001. The magical number 4 in short-term memory: A reconsideration of mental storage capacity. Behavioral and Brain Sciences 24 (2001), 87–186. DOI: http://dx.doi.org/10.1017/S0140525X01003922
10. Birsen Donmez, Linda Ng Boyle, and John D. Lee. 2010. Differences in Off-Road Glances: Effects on Young Drivers’ Performance. Journal of Transportation and Engineering 136, 5 (2010), 403–409. DOI: http://dx.doi.org/10.1061/(ASCE)TE.1943-5436.0000068
11. Eileen Ancman. 1991. Peripherally Located CRTs: Color Perception Limitations. In NAECON. DOI: http://dx.doi.org/10.1109/NAECON.1991.165871
12. Euan Freeman, Stephen Brewster, and Vuokko Lantz. 2014a. Illuminating Gesture Interfaces with Interactive Light Feedback. In NordiCHI. DOI: http://dx.doi.org/10.1007/978-3-319-22723-8_42
13. Euan Freeman, Stephen Brewster, and Vuokko Lantz. 2014b. Tactile Feedback for Above-Device Gesture Interfaces: Adding Touch to Touchless Interactions. In ICMI. DOI: http://dx.doi.org/10.1145/2663204.2663280
14. Euan Freeman, Stephen Brewster, and Vuokko Lantz. 2015. Interactive light feedback: Illuminating Above-Device gesture interfaces. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Vol. 9299. 478–481. DOI: http://dx.doi.org/10.1007/978-3-319-22723-8_42
15. Euan Freeman, Stephen Brewster, and Vuokko Lantz. 2016. Do That, There: An Interaction Technique for Addressing In-Air Gesture Systems. In CHI. ACM, 2319–2331. DOI: http://dx.doi.org/10.1145/2858036.2858308
16. Thomas M. Gable, Keenan R. May, and Bruce N. Walker. 2014. Applying Popular Usability Heuristics to Gesture Interaction in the Vehicle. In AutoUI. 1–7. DOI: http://dx.doi.org/10.1145/2667239.2667298
17. Orestis Georgiou. 2017. Haptic In-Vehicle Gesture Controls. In AutoUI. Oldenburg, Germany. DOI: http://dx.doi.org/10.1145/3131726.3132045
18. Paul Green. 1999. The 15-second rule for driver information systems. Proc. of the ITS America (1999), 1–9. http://www.umich.edu/~driving/publications/ITSA-Green1999.pdf
19. Paul Green. 2000. Crashes Induced by Driver Information Systems and What Can Be Done to Reduce Them. Society of Automotive Engineers Conference Proceedings (2000), 27–36.
20. Paul Green. 2004. Driver distraction, telematics design, and workload managers: safety issues and solutions. Proc. Int. Congr. Transp. Electron. (2004), 165–180.
21. Juan David Hincapié-Ramos, Xiang Guo, Paymahn Moghadasian, and Pourang Irani. 2014. Consumed Endurance: A Metric to Quantify Arm Fatigue of Mid-Air Interactions. In CHI. 1063–1072.
22. Cristy Ho, Hong Z. Tan, and Charles Spence. 2005. Using spatial vibrotactile cues to direct visual attention in driving scenes. Transportation Research Part F: Traffic Psychology and Behaviour 8, 6 (2005), 397–412. DOI: http://dx.doi.org/10.1016/j.trf.2005.05.002
23. John Jonides. 1981. Voluntary Versus Automatic Control Over the Mind’s Eye’s Movement. Attention and Performance IX 9 (1981), 187–203.
24. Raine Kajastila and Tapio Lokki. 2013. Eyes-free interaction with free-hand gestures and auditory menus. International Journal of Human Computer Studies 71, 5 (2013), 627–640. DOI: http://dx.doi.org/10.1016/j.ijhcs.2012.11.003
25. S.G. Klauer, T. A. Dingus, V. L. Neale, J.D. Sudweeks, and D.J. Ramsey. 2006. The Impact of Driver Inattention On Near-Crash/Crash Risk: An Analysis Using the 100-Car Naturalistic Driving Study Data. National Highway Traffic Safety Administration (2006), 226. http://hdl.handle.net/10919/55090
26. Ju-Hwan Lee and Charles Spence. 2008. Assessing the benefits of multimodal feedback on dual-task performance under demanding conditions. In British HCI.
27. Andreas Löcken, Wilko Heuten, and Susanne Boll. 2016. AutoAmbiCar: Using Ambient Light to Inform Drivers About Intentions of Their Automated Cars. In AutoUI. DOI: http://dx.doi.org/10.1145/3004323.3004329
28. Andreas Löcken, Heiko Müller, Wilko Heuten, and Susanne Boll. 2013. AmbiCar: Towards an in-vehicle ambient light display. In AutoUI.
29. Andreas Löcken, Heiko Müller, Wilko Heuten, and Susanne Boll. 2015. An experiment on ambient light patterns to support lane change decisions. In IEEE Intelligent Vehicles Symposium. DOI: http://dx.doi.org/10.1109/IVS.2015.7225735
30. Sebastian Loehmann, Martin Knobel, Melanie Lamara, and Andreas Butz. 2013. Culturally independent gestures for in-car interactions. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) 8119 LNCS, PART 3 (2013), 538–545. DOI: http://dx.doi.org/10.1007/978-3-642-40477-1_34
31. Benjamin Long, Sue Ann Seah, Tom Carter, and Sriram Subramanian. 2014. Rendering volumetric haptic shapes in mid-air using ultrasound. ACM Transactions on Graphics (2014). DOI: http://dx.doi.org/10.1145/2661229.2661257
32. Keenan R May, Thomas M Gable, and Bruce N Walker. 2014. A Multimodal Air Gesture Interface for In Vehicle Menu Navigation. In AutoUI. 1–6. DOI: http://dx.doi.org/10.1145/2667239.2667280
33. S. Morrison-Smith and J. Ruiz. 2014. Using Audio Cues to Support Motion Gesture Interaction on Mobile Devices. CHI EA (2014), 1621–1626. DOI: http://dx.doi.org/10.1145/2559206.2581236
34. Mark I. Nikolic and Nadine B. Sarter. 2001. Peripheral visual feedback: a powerful means of supporting effective attention allocation in event-driven, data-rich environments. Human factors 43, 1 (2001), 30–38. DOI: http://dx.doi.org/10.1518/001872001775992525
35. Yiyun Peng, Linda Ng Boyle, and Shauna L. Hallmark. 2013. Driver’s lane keeping ability with eyes off road: Insights from a naturalistic study. In Accident Analysis and Prevention. DOI: http://dx.doi.org/10.1016/j.aap.2012.06.013
36. Matthew J. Pitts, Lee Skrypchuk, Tom Wellings, Alex Attridge, and Mark A. Williams. 2012. Evaluating user response to in-car haptic feedback touchscreens using the lane change test. Advances in Human-Computer Interaction 2012 (2012). DOI: http://dx.doi.org/10.1155/2012/598739
37. Ioannis Politis. 2014. The Effects of Modality, Urgency and Message Content on Responses to Multimodal Driver Displays. In AutomotiveUI 2014 Adjunct Proceedings. Seattle, WA, USA, 1–5. DOI: http://dx.doi.org/10.13140/2.1.4592.3842
38. Bryan Reimer, Bruce Mehler, J. Dobres, and J.F. Coughlin. 2013. The Effects of a Production Level "Voice-Command" Interface on Driver Behavior: Summary Findings on Reported Workload, Physiology, Visual Attention, and Driving Performance. Technical Report 17A. Massachusetts Institute of Technology, Cambridge, MA.
39. Andreas Riener, Alois Ferscha, Florian Bachmair, Patrick Hagmüller, Alexander Lemme, Dominik Muttenthaler, David Pühringer, Harald Rogner, Adrian Tappe, and Florian Weger. 2013. Standardization of the in-car gesture interaction space. AutoUI (2013), 14–21. DOI: http://dx.doi.org/10.1145/2516540.2516544
40. Sonja Rümelin, Thomas Gabler, and Jesper Bellenbaum. 2017. Clicks are in the Air: How to Support the Interaction with Floating Objects through Ultrasonic Feedback. In AutoUI. DOI: http://dx.doi.org/10.1145/3122986.3123010
41. Gözel Shakeri, John H Williamson, and Stephen Brewster. 2017. Novel Multimodal Feedback Techniques for In-Car Mid-Air Gesture Interaction. In AutoUI. DOI: http://dx.doi.org/10.1145/3122986.3123011
42. Jason Sterkenburg, Joshua Johnson, Steven Landry, and Myounghoon Jeon. 2016a. Development Tool for Rapid Evaluation of Eyes-free In-vehicle Gesture Controls. In AutoUI. DOI:http://dx.doi.org/10.1145/3004323.3004357
43. Jason Sterkenburg, Steven Landry, Myounghoon Jeon, and Joshua Johnson. 2016b. Towards an in-vehicle sonically-enhanced gesture control interface: a pilot study. In ICAD. 0–3. DOI: http://dx.doi.org/10.21785/icad2016.015
44. Jan Theeuwes. 1991. Exogenous and endogenous control of attention: the effect of visual onsets and offsets. Perception & Psychophysics 49, 1 (1991), 83–90. DOI: http://dx.doi.org/10.3758/BF03211619
45. Jan B.F. Van Erp and Hendrik A.H.C. Van Veen. 2004. Vibrotactile in-vehicle navigation system. Transportation Research Part F: Traffic Psychology and Behaviour 7, 4-5 (2004), 247–256. DOI: http://dx.doi.org/10.1016/j.trf.2004.09.003
46. Hanneke Hooft van Huysduynen, Jacques Terken, Alexander Meschtscherjakov, Berry Eggen, and Manfred Tscheligi. 2017. Ambient Light and its Influence on Driving Experience. In AutoUI. DOI: http://dx.doi.org/10.1145/3122986.3122992
47. Yu Zhang and Linda Angell. 2014. Pointing Towards Future Automotive HMIs: The Potential for Gesture Interaction. Ergonomics in Design 22, 3 (2014), 22–29.
48. Ilka Zöller, Roman Bechmann, and Bettina Abendroth. 2017. Possible applications for gestures while driving. Automotive and Engine Technology (2017). DOI: http://dx.doi.org/10.1007/s41104-017-0023-7