Greg Notess, Faculty and Graduate Student Librarian, Montana State University (now retired) and author of Search Engine Showdown, discussed the changing nature of search. We are seeing a change in how we start and run searches. Results used to be ranked by text frequency; now our devices are with us all the time, tracking where we are and what we are looking for. With the rise of graphical and audio capabilities, we can search in very different ways.
Image searching: We have had image capabilities for a long time, and now we can start a search by talking to our device or by looking for images. A lot of searching is for shopping; TinEye is a reverse image search system for finding images online. Image matching allows one to start with an image and find other places where it has been used. Google Photos is designed for small devices, and Google Assistant is available in many places. AR and AI are being incorporated into image matching capabilities. Google Lens can extract text from an image and use it to run a search (for example, extract an email address from a business card and add it to your contacts, or choose something from an image to shop for).
Smart speakers (Alexa, Echo, etc.) will have a large effect. Apple's HomePod claims to have better sound. Baidu (the Chinese search system) has several advanced capabilities. In the library world, devices such as the popIn Aladdin projector can be used to read a book.
AR, AI, and the Future: AI is everywhere in search. Google knows where you are from your phone and can tailor search results accordingly. AR allows viewing of products in context.
Usability varies from system to system: see https://www.youtube.com/watch?v=YvT_gqs5ETk.