Have you ever wished you could see nearby things more clearly than your eyes allow? Maybe you want to read the small print on a label, a menu or a book.
Or maybe you just want to have some fun with your iPhone and discover new details in your surroundings.
Whatever the reason, there's a handy feature on your iPhone that can help you see things more easily and even tell you what it's looking at.
It's called Point and Speak, and it does exactly what the name suggests: Point your iPhone at something with text on it, and it will read that text aloud to you.
What is Point and Speak?
Point and Speak is part of the Magnifier app, a built-in accessibility tool designed for people with visual impairments. But anyone can use it, whether you have low vision or not.
The Magnifier app turns your iPhone into a digital magnifying glass that can zoom in and out, adjust brightness and contrast, and apply filters to enhance the image. Point and Speak adds another layer of functionality: It can recognize text in the camera's view and read it aloud using Siri's voice.
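For readers curious about what's going on under the hood, here is a minimal Swift sketch of the same two building blocks, text recognition and speech, using Apple's public Vision and AVFoundation frameworks. This is not Magnifier's actual implementation, just the public APIs that offer similar capabilities, and the speakText helper name is purely illustrative:

```swift
import Vision
import AVFoundation
import UIKit

// Keep the synthesizer alive; a local one can be deallocated before it finishes speaking.
let synthesizer = AVSpeechSynthesizer()

// Illustrative helper (not an Apple API): recognize text in a still image and speak it.
func speakText(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        // Join the best guess from each detected text region.
        let text = observations
            .compactMap { $0.topCandidates(1).first?.string }
            .joined(separator: " ")
        guard !text.isEmpty else { return }
        synthesizer.speak(AVSpeechUtterance(string: text))
    }
    request.recognitionLevel = .accurate  // slower, but better for small print

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```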
Requirements to use Point and Speak
To use Point and Speak, you need iOS 17 or later installed on your iPhone, and your iPhone model must have a LiDAR scanner. LiDAR is a sensor that uses laser light to measure depth and distance. You can check whether your iPhone has one by looking at the back of the phone: If you see a small black circle next to the main camera lenses, your iPhone has a LiDAR scanner. The iPhone models with a LiDAR scanner are:
iPhone 12 Pro and iPhone 12 Pro Max
iPhone 13 Pro and iPhone 13 Pro Max
iPhone 14 Pro and iPhone 14 Pro Max
iPhone 15 Pro and iPhone 15 Pro Max
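If you write apps and would rather check for LiDAR in code than eyeball the camera bump, ARKit offers a public check: scene depth is only supported on LiDAR-equipped devices. A quick sketch:

```swift
import ARKit

// Scene depth requires the LiDAR scanner, so this check doubles as a LiDAR test.
let hasLiDAR = ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth)
print(hasLiDAR ? "This device has a LiDAR scanner." : "No LiDAR scanner on this device.")
```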
If you don't already have the Magnifier app on your iPhone, you can download it for free from the App Store.
How to use Point and Speak
If your iPhone has a LiDAR scanner and is running iOS 17 or later, you can use Point and Speak by following these steps:
1. Open the Magnifier app.
2. Tap the Detection Mode button.
3. Tap the Point and Speak button, which looks like a hand with a pointing finger.
4. Hold your iPhone so the camera can see the text, then point your finger at the words you want to hear.
5. Your iPhone will read aloud the text near your finger.
How to receive Live Image Descriptions
There is also a feature that can help you know what your iPhone is seeing. It's called Live Image Descriptions, and it can tell you what objects, people and text are in your camera view. Live Image Descriptions is also part of the Magnifier app. Here's how to turn it on and use it:
1. Open the Magnifier app.
2. Tap the Detection Mode button.
3. Tap the Image Descriptions button.
4. Point your camera at your surroundings, and a description of what the camera sees will appear on the screen.
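For developers, here is a rough Swift sketch of the same idea using Vision's public image-classification API. Apple's on-device description model in Magnifier is private and far more capable; the describeScene helper name and the confidence cutoff below are illustrative choices, not part of any Apple API:

```swift
import Vision
import UIKit

// Illustrative helper (not an Apple API): print coarse labels for what an image contains.
func describeScene(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    let request = VNClassifyImageRequest { request, _ in
        guard let results = request.results as? [VNClassificationObservation] else { return }
        // Keep a few reasonably confident labels; 0.3 is an arbitrary cutoff.
        let labels = results
            .filter { $0.confidence > 0.3 }
            .prefix(3)
            .map { $0.identifier }
        print("Looks like: \(labels.joined(separator: ", "))")
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```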
Live Image Descriptions is a useful and fun feature that can help you explore and learn more about your environment. Try it out and see what your iPhone can see.
Kurt’s key takeaways
Point and Speak and Live Image Descriptions are amazing tools that can help you see and hear the world around you using your iPhone. They are useful not only for people with visual impairments but also for anyone who wants to have some fun and discover new things in their surroundings.
Whether you want to read small print, a sign or a book, or you want to know what objects, people and text are in your camera view, these features make your iPhone more than just a phone. Try them out and see what your iPhone can do for you.
What are some situations where you would use the point-and-speak or the Live Image Descriptions feature? Let us know by writing us at Cyberguy.com/Contact.
For more of my tech tips & security alerts, subscribe to my free CyberGuy Report Newsletter by heading to Cyberguy.com/Newsletter.
Ask Kurt a question or let us know what stories you’d like us to cover.