{"id":90397,"date":"2019-11-07T04:04:15","date_gmt":"2019-11-07T12:04:15","guid":{"rendered":"https:\/\/www.intego.com\/mac-security-blog\/?p=90397"},"modified":"2021-11-03T14:03:24","modified_gmt":"2021-11-03T21:03:24","slug":"researchers-use-lasers-to-hack-siri-alexa-google-assistants","status":"publish","type":"post","link":"https:\/\/www.intego.com\/mac-security-blog\/researchers-use-lasers-to-hack-siri-alexa-google-assistants\/","title":{"rendered":"Researchers use lasers to hack Siri, Alexa, Google assistants"},"content":{"rendered":"<p><img loading=\"lazy\" class=\"aligncenter size-full wp-image-90421\" src=\"https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2019\/11\/Light-Commands-attack-scares-Apple-HomePod-with-emoji-eyes-600x300.png\" alt=\"\" width=\"600\" height=\"300\" srcset=\"https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2019\/11\/Light-Commands-attack-scares-Apple-HomePod-with-emoji-eyes-600x300.png 600w, https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2019\/11\/Light-Commands-attack-scares-Apple-HomePod-with-emoji-eyes-600x300-150x75.png 150w, https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2019\/11\/Light-Commands-attack-scares-Apple-HomePod-with-emoji-eyes-600x300-300x150.png 300w\" sizes=\"(max-width: 600px) 100vw, 600px\" \/>Researchers have used programmable low-powered lasers to remotely control voice-activated personal assistants like Apple&#8217;s Siri, Amazon&#8217;s Alexa, and Google Assistant from up to 360 feet away. 
These attacks exploit a vulnerability in microphones that use micro-electro-mechanical systems (MEMS), which the researchers discovered respond to laser light the same way they respond to sound.<\/p>\n<p>The attack method, called Light Commands, was tested only on the three big-name players in the personal assistant space, but the researchers believe it will affect any microphone that uses MEMS (presumably including devices with Microsoft Cortana, Baidu DuerOS, or other digital assistants).<\/p>\n<p>Although the attack is essentially a proof of concept at this stage and has significant limitations (such as requiring a direct line of sight to the device), the researchers admit that they don\u2019t fully understand why the exploit works, which could open the door to others finding ways to make it even more effective.<\/p>\n<p>Voice-activated systems vary in how much control they allow over a device without first requiring user authentication. Apple\u2019s track record with Siri is mostly decent in this regard, with most critical voice-activated functions requiring a passcode or biometric verification (e.g. Touch ID or Face ID) before completing the request.<\/p>\n<p>In some circumstances, the Light Commands attack can be used to brute-force a device passcode, so the usual security advice applies: choose a complex, not easily guessable passcode, and set up your device to lock after a certain number of incorrect attempts. 
Even that basic level of security should protect you against this new and admittedly sci-fi sounding method of remote device hacking.<\/p>\n<p>You can find out more about the Light Commands attack at <a href=\"https:\/\/arstechnica.com\/information-technology\/2019\/11\/researchers-hack-siri-alexa-and-google-home-by-shining-lasers-at-them\/\" target=\"_blank\" rel=\"noopener\">Ars Technica<\/a>\u00a0or at the <a href=\"https:\/\/lightcommands.com\/\" target=\"_blank\" rel=\"noopener\">Light Commands<\/a> homepage.<\/p>\n<p><em>Related:<\/em><\/p>\n<p>Researchers came up with a similar attack called <a href=\"https:\/\/www.intego.com\/mac-security-blog\/month-in-review-apple-security-in-september-2017\/#DolphinAttack\">DolphinAttack<\/a> in 2017, where inaudibly high-pitched voice commands were used to control an iPhone.<\/p>\n<div id=\"attachment_70687\" style=\"width: 1034px\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/www.intego.com\/mac-security-blog\/month-in-review-apple-security-in-september-2017\/#DolphinAttack\"><img aria-describedby=\"caption-attachment-70687\" loading=\"lazy\" class=\"wp-image-70687 size-large\" src=\"https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2017\/10\/DolphinAttack-demo-1024x576.png\" alt=\"\" width=\"1024\" height=\"576\" srcset=\"https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2017\/10\/DolphinAttack-demo-1024x576.png 1024w, https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2017\/10\/DolphinAttack-demo-150x84.png 150w, https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2017\/10\/DolphinAttack-demo-300x169.png 300w, https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2017\/10\/DolphinAttack-demo-768x432.png 768w, https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2017\/10\/DolphinAttack-demo-657x369.png 657w, https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2017\/10\/DolphinAttack-demo.png 1708w\" 
sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><\/a><p id=\"caption-attachment-70687\" class=\"wp-caption-text\"><a href=\"https:\/\/www.intego.com\/mac-security-blog\/month-in-review-apple-security-in-september-2017\/#DolphinAttack\">DolphinAttack<\/a> is a method of sending a Hey Siri voice command in such a high pitch that humans cannot hear it. Image: Guoming Zhang\u00a0via\u00a0<a href=\"https:\/\/www.youtube.com\/watch?v=21HjF4A3WE4\" target=\"_blank\" rel=\"noopener\">YouTube<\/a>.<\/p><\/div>\n<h3>How can I learn more?<\/h3>\n<p><a href=\"https:\/\/itunes.apple.com\/us\/podcast\/intego-mac-podcast\/id1293834627\" target=\"_blank\" rel=\"noopener\"><img loading=\"lazy\" class=\"alignright size-thumbnail wp-image-71818\" src=\"https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2017\/10\/ios9-podcasts-app-tile-150x150.png\" alt=\"\" width=\"50\" height=\"50\" srcset=\"https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2017\/10\/ios9-podcasts-app-tile-150x150.png 150w, https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2017\/10\/ios9-podcasts-app-tile-32x32.png 32w, https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2017\/10\/ios9-podcasts-app-tile-50x50.png 50w, https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2017\/10\/ios9-podcasts-app-tile-64x64.png 64w, https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2017\/10\/ios9-podcasts-app-tile-96x96.png 96w, https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2017\/10\/ios9-podcasts-app-tile-128x128.png 128w, https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2017\/10\/ios9-podcasts-app-tile.png 300w\" sizes=\"(max-width: 50px) 100vw, 50px\" \/><\/a>This week on the <strong>Intego Mac Podcast<\/strong>\u00a0<a href=\"https:\/\/podcast.intego.com\/108\">episode 108<\/a>, Intego&#8217;s experts will discuss the new Light Commands attack, as well as <a 
href=\"https:\/\/www.intego.com\/mac-security-blog\/apple-updates-its-privacy-page-with-sleek-new-look\/\" rel=\"noopener\">Apple&#8217;s privacy policy page update<\/a>, and <a href=\"https:\/\/www.intego.com\/mac-security-blog\/ipad-vs-macbook-is-ipados-a-game-changer\/\" rel=\"noopener\">whether an iPad can replace your MacBook<\/a>. Be sure to <a href=\"https:\/\/podcasts.apple.com\/us\/podcast\/intego-mac-podcast\/id1293834627\" target=\"_blank\" rel=\"noopener\">follow the podcast<\/a> to make sure you never miss the latest episode.<\/p>\n<p><iframe loading=\"lazy\" src=\"https:\/\/player.fireside.fm\/v2\/GegHgcrH+nO6qEr3R?theme=dark\" width=\"740\" height=\"200\" frameborder=\"0\" scrolling=\"no\"><\/iframe><\/p>\n<p>You can also subscribe to our <a href=\"https:\/\/www.intego.com\/mac-security-blog\/mac-security-newsletter\/\"><strong>e-mail newsletter<\/strong><\/a> and keep an eye here on <a href=\"https:\/\/www.intego.com\/mac-security-blog\"><strong>The Mac Security Blog<\/strong><\/a> for the latest Apple security and privacy news. 
And don&#8217;t forget to follow Intego on your favorite social media channels: <a href=\"https:\/\/twitter.com\/IntegoSecurity\" target=\"_blank\" rel=\"noopener\"><img style=\"border-width: 1px; border-style: solid; border-color: rgba(255, 255, 255, 0.2); border-radius: 8px;\" title=\"Follow Intego on Twitter\" src=\"https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2021\/10\/Twitter-logo-icon-64.png\" alt=\"Follow Intego on Twitter\" width=\"16\" \/><\/a>\u00a0<a href=\"https:\/\/www.facebook.com\/Intego\" target=\"_blank\" rel=\"noopener\"><img style=\"border-width: 1px; border-style: solid; border-color: rgba(255, 255, 255, 0.2); border-radius: 8px;\" title=\"Follow Intego on Facebook\" src=\"https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2021\/10\/Facebook-logo-icon-64.png\" alt=\"Follow Intego on Facebook\" width=\"16\" \/><\/a>\u00a0<a href=\"https:\/\/www.youtube.com\/user\/IntegoVideo?sub_confirmation=1\" target=\"_blank\" rel=\"noopener\"><img style=\"border-width: 1px; border-style: solid; border-color: rgba(0, 0, 0, 0.2); border-radius: 8px;\" title=\"Follow Intego on YouTube\" src=\"https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2021\/10\/YouTube-logo-icon-64.png\" alt=\"Follow Intego on YouTube\" width=\"16\" \/><\/a>\u00a0<a href=\"https:\/\/www.pinterest.com\/intego\/\" target=\"_blank\" rel=\"noopener\"><img style=\"border-width: 1px; border-style: solid; border-color: rgba(0, 0, 0, 0.2); border-radius: 8px;\" title=\"Follow Intego on Pinterest\" src=\"https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2021\/10\/Pinterest-logo-icon-64.png\" alt=\"Follow Intego on Pinterest\" width=\"16\" \/><\/a>\u00a0<a href=\"https:\/\/www.linkedin.com\/company\/intego\" target=\"_blank\" rel=\"noopener\"><img style=\"border-width: 1px; border-style: solid; border-color: rgba(255, 255, 255, 0.2); border-radius: 8px;\" title=\"Follow Intego on LinkedIn\" 
src=\"https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2021\/10\/LinkedIn-logo-icon-64.png\" alt=\"Follow Intego on LinkedIn\" width=\"16\" \/><\/a>\u00a0<a href=\"https:\/\/www.instagram.com\/intego_security\/\" target=\"_blank\" rel=\"noopener\"><img style=\"border-width: 1px; border-style: solid; border-color: rgba(255, 255, 255, 0.2); border-radius: 8px;\" title=\"Follow Intego on Instagram\" src=\"https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2021\/10\/Instagram-logo-icon-64.png\" alt=\"Follow Intego on Instagram\" width=\"16\" \/><\/a>\u00a0<a href=\"https:\/\/podcasts.apple.com\/us\/podcast\/intego-mac-podcast\/id1293834627\" target=\"_blank\" rel=\"noopener\"><img style=\"border-width: 1px; border-style: solid; border-color: rgba(255, 255, 255, 0.2); border-radius: 8px;\" title=\"Follow the Intego Mac Podcast on Apple Podcasts\" src=\"https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2017\/10\/ios9-podcasts-app-tile.png\" alt=\"Follow the Intego Mac Podcast on Apple Podcasts\" width=\"16\" \/><\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Researchers have used programmable low-powered lasers to remotely control voice-activated personal assistants like Apple&#8217;s Siri, Amazon&#8217;s Alexa, and Google Assistant from up to 360 feet away. These attacks exploit a vulnerability in microphones using micro-electro-mechanical systems (MEMS), which the researchers have discovered will respond to lasers the same way they will respond to sound. 
The [&hellip;]<\/p>\n","protected":false},"author":100,"featured_media":90424,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"spay_email":"","jetpack_publicize_message":"","jetpack_is_tweetstorm":false},"categories":[13],"tags":[4009,4000,4006,4003,4219,3925,645],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v17.4 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<meta name=\"description\" content=\"Researchers have used programmable low-powered lasers to remotely control voice-activated personal assistants like Apple&#039;s Siri, Amazon&#039;s Alexa, and\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.intego.com\/mac-security-blog\/researchers-use-lasers-to-hack-siri-alexa-google-assistants\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Researchers use lasers to hack Siri, Alexa, Google assistants - The Mac Security Blog\" \/>\n<meta property=\"og:description\" content=\"Researchers have used programmable low-powered lasers to remotely control voice-activated personal assistants like Apple&#039;s Siri, Amazon&#039;s Alexa, and\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.intego.com\/mac-security-blog\/researchers-use-lasers-to-hack-siri-alexa-google-assistants\/\" \/>\n<meta property=\"og:site_name\" content=\"The Mac Security Blog\" \/>\n<meta property=\"article:published_time\" content=\"2019-11-07T12:04:15+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2021-11-03T21:03:24+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2019\/11\/Light-Commands-attack-scares-Apple-HomePod-with-emoji-eyes-400x260.png\" \/>\n\t<meta property=\"og:image:width\" content=\"400\" 
\/>\n\t<meta property=\"og:image:height\" content=\"260\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Chris Rawson\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"4 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Organization\",\"@id\":\"https:\/\/www.intego.com\/mac-security-blog\/#organization\",\"name\":\"Intego\",\"url\":\"https:\/\/www.intego.com\/mac-security-blog\/\",\"sameAs\":[],\"logo\":{\"@type\":\"ImageObject\",\"@id\":\"https:\/\/www.intego.com\/mac-security-blog\/#logo\",\"inLanguage\":\"en-US\",\"url\":\"https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2022\/10\/intego-organization-logo-for-google-knowledge-graph-875x875-1.png\",\"contentUrl\":\"https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2022\/10\/intego-organization-logo-for-google-knowledge-graph-875x875-1.png\",\"width\":875,\"height\":875,\"caption\":\"Intego\"},\"image\":{\"@id\":\"https:\/\/www.intego.com\/mac-security-blog\/#logo\"}},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/www.intego.com\/mac-security-blog\/#website\",\"url\":\"https:\/\/www.intego.com\/mac-security-blog\/\",\"name\":\"The Mac Security Blog\",\"description\":\"Keep Macs safe from the dangers of the Internet\",\"publisher\":{\"@id\":\"https:\/\/www.intego.com\/mac-security-blog\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/www.intego.com\/mac-security-blog\/?s={search_term_string}\"},\"query-input\":\"required 
name=search_term_string\"}],\"inLanguage\":\"en-US\"},{\"@type\":\"ImageObject\",\"@id\":\"https:\/\/www.intego.com\/mac-security-blog\/researchers-use-lasers-to-hack-siri-alexa-google-assistants\/#primaryimage\",\"inLanguage\":\"en-US\",\"url\":\"https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2019\/11\/Light-Commands-attack-scares-Apple-HomePod-with-emoji-eyes-400x260.png\",\"contentUrl\":\"https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2019\/11\/Light-Commands-attack-scares-Apple-HomePod-with-emoji-eyes-400x260.png\",\"width\":400,\"height\":260,\"caption\":\"Light Commands attack scares Apple HomePod with emoji eyes\"},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/www.intego.com\/mac-security-blog\/researchers-use-lasers-to-hack-siri-alexa-google-assistants\/#webpage\",\"url\":\"https:\/\/www.intego.com\/mac-security-blog\/researchers-use-lasers-to-hack-siri-alexa-google-assistants\/\",\"name\":\"Researchers use lasers to hack Siri, Alexa, Google assistants - The Mac Security Blog\",\"isPartOf\":{\"@id\":\"https:\/\/www.intego.com\/mac-security-blog\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/www.intego.com\/mac-security-blog\/researchers-use-lasers-to-hack-siri-alexa-google-assistants\/#primaryimage\"},\"datePublished\":\"2019-11-07T12:04:15+00:00\",\"dateModified\":\"2021-11-03T21:03:24+00:00\",\"description\":\"Researchers have used programmable low-powered lasers to remotely control voice-activated personal assistants like Apple's Siri, Amazon's Alexa, 
and\",\"breadcrumb\":{\"@id\":\"https:\/\/www.intego.com\/mac-security-blog\/researchers-use-lasers-to-hack-siri-alexa-google-assistants\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/www.intego.com\/mac-security-blog\/researchers-use-lasers-to-hack-siri-alexa-google-assistants\/\"]}]},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/www.intego.com\/mac-security-blog\/researchers-use-lasers-to-hack-siri-alexa-google-assistants\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/www.intego.com\/mac-security-blog\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Researchers use lasers to hack Siri, Alexa, Google assistants\"}]},{\"@type\":\"Article\",\"@id\":\"https:\/\/www.intego.com\/mac-security-blog\/researchers-use-lasers-to-hack-siri-alexa-google-assistants\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/www.intego.com\/mac-security-blog\/researchers-use-lasers-to-hack-siri-alexa-google-assistants\/#webpage\"},\"author\":{\"@id\":\"https:\/\/www.intego.com\/mac-security-blog\/#\/schema\/person\/4d85c65ddd0561c135d03668332d7a1d\"},\"headline\":\"Researchers use lasers to hack Siri, Alexa, Google assistants\",\"datePublished\":\"2019-11-07T12:04:15+00:00\",\"dateModified\":\"2021-11-03T21:03:24+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/www.intego.com\/mac-security-blog\/researchers-use-lasers-to-hack-siri-alexa-google-assistants\/#webpage\"},\"wordCount\":436,\"commentCount\":0,\"publisher\":{\"@id\":\"https:\/\/www.intego.com\/mac-security-blog\/#organization\"},\"image\":{\"@id\":\"https:\/\/www.intego.com\/mac-security-blog\/researchers-use-lasers-to-hack-siri-alexa-google-assistants\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2019\/11\/Light-Commands-attack-scares-Apple-HomePod-with-emoji-eyes-400x260.png\",\"keywords\":[\"Alexa\",\"Amazon Echo\",\"Apple 
HomePod\",\"Google Home\",\"Hey Siri\",\"HomePod\",\"Siri\"],\"articleSection\":[\"Security &amp; Privacy\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\/\/www.intego.com\/mac-security-blog\/researchers-use-lasers-to-hack-siri-alexa-google-assistants\/#respond\"]}]},{\"@type\":\"Person\",\"@id\":\"https:\/\/www.intego.com\/mac-security-blog\/#\/schema\/person\/4d85c65ddd0561c135d03668332d7a1d\",\"name\":\"Chris Rawson\",\"image\":{\"@type\":\"ImageObject\",\"@id\":\"https:\/\/www.intego.com\/mac-security-blog\/#personlogo\",\"inLanguage\":\"en-US\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/1fbdb7a0edb62e962859aa253d8e9b7e?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/1fbdb7a0edb62e962859aa253d8e9b7e?s=96&d=mm&r=g\",\"caption\":\"Chris Rawson\"},\"url\":\"https:\/\/www.intego.com\/mac-security-blog\/author\/chris-rawson\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"description":"Researchers have used programmable low-powered lasers to remotely control voice-activated personal assistants like Apple's Siri, Amazon's Alexa, and","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.intego.com\/mac-security-blog\/researchers-use-lasers-to-hack-siri-alexa-google-assistants\/","og_locale":"en_US","og_type":"article","og_title":"Researchers use lasers to hack Siri, Alexa, Google assistants - The Mac Security Blog","og_description":"Researchers have used programmable low-powered lasers to remotely control voice-activated personal assistants like Apple's Siri, Amazon's Alexa, and","og_url":"https:\/\/www.intego.com\/mac-security-blog\/researchers-use-lasers-to-hack-siri-alexa-google-assistants\/","og_site_name":"The Mac Security 
Blog","article_published_time":"2019-11-07T12:04:15+00:00","article_modified_time":"2021-11-03T21:03:24+00:00","og_image":[{"width":400,"height":260,"url":"https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2019\/11\/Light-Commands-attack-scares-Apple-HomePod-with-emoji-eyes-400x260.png","type":"image\/png"}],"twitter_card":"summary_large_image","twitter_misc":{"Written by":"Chris Rawson","Est. reading time":"4 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Organization","@id":"https:\/\/www.intego.com\/mac-security-blog\/#organization","name":"Intego","url":"https:\/\/www.intego.com\/mac-security-blog\/","sameAs":[],"logo":{"@type":"ImageObject","@id":"https:\/\/www.intego.com\/mac-security-blog\/#logo","inLanguage":"en-US","url":"https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2022\/10\/intego-organization-logo-for-google-knowledge-graph-875x875-1.png","contentUrl":"https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2022\/10\/intego-organization-logo-for-google-knowledge-graph-875x875-1.png","width":875,"height":875,"caption":"Intego"},"image":{"@id":"https:\/\/www.intego.com\/mac-security-blog\/#logo"}},{"@type":"WebSite","@id":"https:\/\/www.intego.com\/mac-security-blog\/#website","url":"https:\/\/www.intego.com\/mac-security-blog\/","name":"The Mac Security Blog","description":"Keep Macs safe from the dangers of the Internet","publisher":{"@id":"https:\/\/www.intego.com\/mac-security-blog\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.intego.com\/mac-security-blog\/?s={search_term_string}"},"query-input":"required 
name=search_term_string"}],"inLanguage":"en-US"},{"@type":"ImageObject","@id":"https:\/\/www.intego.com\/mac-security-blog\/researchers-use-lasers-to-hack-siri-alexa-google-assistants\/#primaryimage","inLanguage":"en-US","url":"https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2019\/11\/Light-Commands-attack-scares-Apple-HomePod-with-emoji-eyes-400x260.png","contentUrl":"https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2019\/11\/Light-Commands-attack-scares-Apple-HomePod-with-emoji-eyes-400x260.png","width":400,"height":260,"caption":"Light Commands attack scares Apple HomePod with emoji eyes"},{"@type":"WebPage","@id":"https:\/\/www.intego.com\/mac-security-blog\/researchers-use-lasers-to-hack-siri-alexa-google-assistants\/#webpage","url":"https:\/\/www.intego.com\/mac-security-blog\/researchers-use-lasers-to-hack-siri-alexa-google-assistants\/","name":"Researchers use lasers to hack Siri, Alexa, Google assistants - The Mac Security Blog","isPartOf":{"@id":"https:\/\/www.intego.com\/mac-security-blog\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www.intego.com\/mac-security-blog\/researchers-use-lasers-to-hack-siri-alexa-google-assistants\/#primaryimage"},"datePublished":"2019-11-07T12:04:15+00:00","dateModified":"2021-11-03T21:03:24+00:00","description":"Researchers have used programmable low-powered lasers to remotely control voice-activated personal assistants like Apple's Siri, Amazon's Alexa, 
and","breadcrumb":{"@id":"https:\/\/www.intego.com\/mac-security-blog\/researchers-use-lasers-to-hack-siri-alexa-google-assistants\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.intego.com\/mac-security-blog\/researchers-use-lasers-to-hack-siri-alexa-google-assistants\/"]}]},{"@type":"BreadcrumbList","@id":"https:\/\/www.intego.com\/mac-security-blog\/researchers-use-lasers-to-hack-siri-alexa-google-assistants\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.intego.com\/mac-security-blog\/"},{"@type":"ListItem","position":2,"name":"Researchers use lasers to hack Siri, Alexa, Google assistants"}]},{"@type":"Article","@id":"https:\/\/www.intego.com\/mac-security-blog\/researchers-use-lasers-to-hack-siri-alexa-google-assistants\/#article","isPartOf":{"@id":"https:\/\/www.intego.com\/mac-security-blog\/researchers-use-lasers-to-hack-siri-alexa-google-assistants\/#webpage"},"author":{"@id":"https:\/\/www.intego.com\/mac-security-blog\/#\/schema\/person\/4d85c65ddd0561c135d03668332d7a1d"},"headline":"Researchers use lasers to hack Siri, Alexa, Google assistants","datePublished":"2019-11-07T12:04:15+00:00","dateModified":"2021-11-03T21:03:24+00:00","mainEntityOfPage":{"@id":"https:\/\/www.intego.com\/mac-security-blog\/researchers-use-lasers-to-hack-siri-alexa-google-assistants\/#webpage"},"wordCount":436,"commentCount":0,"publisher":{"@id":"https:\/\/www.intego.com\/mac-security-blog\/#organization"},"image":{"@id":"https:\/\/www.intego.com\/mac-security-blog\/researchers-use-lasers-to-hack-siri-alexa-google-assistants\/#primaryimage"},"thumbnailUrl":"https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2019\/11\/Light-Commands-attack-scares-Apple-HomePod-with-emoji-eyes-400x260.png","keywords":["Alexa","Amazon Echo","Apple HomePod","Google Home","Hey Siri","HomePod","Siri"],"articleSection":["Security &amp; 
Privacy"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/www.intego.com\/mac-security-blog\/researchers-use-lasers-to-hack-siri-alexa-google-assistants\/#respond"]}]},{"@type":"Person","@id":"https:\/\/www.intego.com\/mac-security-blog\/#\/schema\/person\/4d85c65ddd0561c135d03668332d7a1d","name":"Chris Rawson","image":{"@type":"ImageObject","@id":"https:\/\/www.intego.com\/mac-security-blog\/#personlogo","inLanguage":"en-US","url":"https:\/\/secure.gravatar.com\/avatar\/1fbdb7a0edb62e962859aa253d8e9b7e?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/1fbdb7a0edb62e962859aa253d8e9b7e?s=96&d=mm&r=g","caption":"Chris Rawson"},"url":"https:\/\/www.intego.com\/mac-security-blog\/author\/chris-rawson\/"}]}},"jetpack_featured_media_url":"https:\/\/www.intego.com\/mac-security-blog\/wp-content\/uploads\/2019\/11\/Light-Commands-attack-scares-Apple-HomePod-with-emoji-eyes-400x260.png","jetpack_publicize_connections":[],"jetpack_sharing_enabled":true,"jetpack_shortlink":"https:\/\/wp.me\/p4VAYd-nw1","amp_enabled":true,"_links":{"self":[{"href":"https:\/\/origin.intego.com\/mac-security-blog\/wp-json\/wp\/v2\/posts\/90397"}],"collection":[{"href":"https:\/\/origin.intego.com\/mac-security-blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/origin.intego.com\/mac-security-blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/origin.intego.com\/mac-security-blog\/wp-json\/wp\/v2\/users\/100"}],"replies":[{"embeddable":true,"href":"https:\/\/origin.intego.com\/mac-security-blog\/wp-json\/wp\/v2\/comments?post=90397"}],"version-history":[{"count":9,"href":"https:\/\/origin.intego.com\/mac-security-blog\/wp-json\/wp\/v2\/posts\/90397\/revisions"}],"predecessor-version":[{"id":94932,"href":"https:\/\/origin.intego.com\/mac-security-blog\/wp-json\/wp\/v2\/posts\/90397\/revisions\/94932"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/origin.intego.com\/mac-secu
rity-blog\/wp-json\/wp\/v2\/media\/90424"}],"wp:attachment":[{"href":"https:\/\/origin.intego.com\/mac-security-blog\/wp-json\/wp\/v2\/media?parent=90397"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/origin.intego.com\/mac-security-blog\/wp-json\/wp\/v2\/categories?post=90397"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/origin.intego.com\/mac-security-blog\/wp-json\/wp\/v2\/tags?post=90397"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}