Voice – Alan Blog
https://alan.app/blog/
Alan Blog: follow the most recent Voice AI articles

The product manager’s guide to intelligent voice interfaces
https://alan.app/blog/the-product-managers-guide-to-ai-voice-assistants/
Tue, 26 Oct 2021

As a product manager, your job is to constantly look for ways to improve your application, delight your customers, and resolve pain points. In this regard, voice interfaces provide a unique opportunity to secure and expand your app’s position in the market where you compete.

Voice assistants are not new. Siri is now ten years old. But the voice interface market is nearing a turning point, where advances in artificial intelligence and mobile computing are making them a ubiquitous part of every user’s computing experience.

By giving applications multimodal interfaces, voice assistants bring the user experience closer to human interactions. They also give app developers a way to offer infinite functionality, an especially important factor for small-screen mobile devices and wearables. And from a product management perspective, voice interfaces enable product teams to iterate and ship new features at a rapid pace.

However, not all voice interfaces are made equal. The first generation of voice interfaces, which made their debut on mobile operating systems and smart speakers, is limited in the scope of benefits it can bring to applications. The absence of cross-platform support, privacy concerns, and a lack of contextual awareness make it very difficult to integrate these voice platforms into applications. In other words, they were created to serve the needs of their vendors, not app developers.

Meeting these challenges is the vision behind the Alan Platform, an intelligent voice interface created from the ground up with product integration in mind. The Alan Platform provides superior natural language processing capabilities thanks to deep integration with your application, which enables it to draw contextual insights from various sources, including voice, interactions with the application interface, and business workflows.

Alan Platform works across all web and mobile operating systems and is easy to integrate with your application, requiring minimal changes to the backend and frontend. Your team doesn’t need to have any experience in machine learning or technical knowledge of AI to integrate the Alan Platform and use it in your application.

The Alan Platform is also a privacy- and security-friendly voice assistant. Every Alan customer gets an independent instance of the Alan Server, where they have exclusive ownership of their data. There is no third-party access to the data, and the server instance lives in a secure cloud that complies with all major enterprise-grade data protection standards.

Finally, Alan AI has been designed for super-fast iteration and infinite functionality support. The Alan Platform comes with a rich analytics tool that gives you fine-grained, real-time visibility into how users interact with your voice interface and graphical elements, and how they respond to changes in your application. These voice analytics are a great source for finding current pain points, testing hypotheses, and drawing inspiration for new ideas to improve your application.

Please send a note to sales@alan.app to get access to the white paper on “why voice should be part of your 2022 digital roadmap” and find out what the Alan Platform can do for you and how our customers are using it to transform the user experience of their applications.

Alan AI Voice Interface Competition
https://alan.app/blog/alan-ai-video-competition/
Thu, 09 Sep 2021

We’ve created a competition that allows you to showcase the voice assistant you’ve created on the Alan Platform.

To be entered into the competition, click the link here to register. In the meantime, here are some videos we’ve created for you to check out:

Hope you enter the competition. Best of luck!

Also, please check out this course we designed for you. If you send in a submission of your project, let us know and we’ll provide a free code for the course.

If you would like to learn more about Alan AI in general or have any questions, please feel free to book a meeting with one of our Customer Success team members here.

How Voice Assistants Increase Revenue And Usability of eCommerce Apps
https://alan.app/blog/how-voice-assistants-increase-revenue-and-usability-of-e-commerce-apps/
Thu, 29 Oct 2020

If you’ve been trying hard to boost your revenue and enhance the usability of ecommerce apps, voice assistants are here to help you out.

Before we understand how voice assistants shape the ecommerce industry — what is voice commerce and how is it related to voice assistants?

What is Voice Commerce?

Voice commerce is the act of employing voice recognition technology to let users interact with ecommerce websites and applications to search, get support, and purchase products just by using their voice. Voice commerce is growing fast: according to Juniper Research, the number of devices supporting it is expected to grow from 1.5 billion today to 8 billion by 2023. So if you’ve been thinking about adding a voice assistant to your ecommerce store, now is the perfect time. You’re not too late!

What’s more, general market awareness of voice technology is particularly high. According to a report by PwC, only 10% of surveyed respondents were unaware of voice-enabled devices and products, and 90% of the aware respondents had used a voice assistant. Widespread adoption of voice assistants is being driven by younger consumers and households.

That said, businesses are reaping the benefits from the mainstream adoption of voice assistants in various ways. 

How Voice Assistants Drive Business Outcomes

Business Cost Savings

If you believe that implementing a voice assistant in your ecommerce store will be a hefty expenditure, you might want to reconsider. Yes, the upfront investment may be larger, but compared to the gains it brings a few years down the line, the amount you are investing is modest.

As a matter of fact, the return on investment for in-app voice assistants is considerable. First, maintenance costs are low. The easiest route is to use a third-party stand-alone voice assistant: you pay on a subscription basis, and maintenance is the provider’s responsibility.

Second, as voice assistants go mainstream, they attract better leads and close more sales. Users can shop even when they are out, whether driving or meeting someone. They only need to instruct the voice assistant to place an order for XYZ, and that’s all: no scrolling, browsing, or tapping required.

This helps explain projections that consumer spending via voice assistants will reach 18% by 2022.

Higher Customer Satisfaction

Believe us when we say that voice assistants make for better customer satisfaction. Consumers get personalized attention and real-time responses, just as they would when shopping in a brick-and-mortar store, all from the comfort of their homes.

A voice assistant considerably reduces the time to buy. According to Bing, searching with your voice is about 3.7 times faster than typing. Google’s findings point the same way: it revealed that 70% of searches on Google Assistant are phrased in natural language.

What’s more, a voice assistant not only helps you deliver a better user experience, but also gives you critical data points that can be used to further enhance your services. Considering that 40% of adults use voice search daily, it’s easy to see what kind of data you can gather by adding a voice assistant to your ecommerce store.

Savings On Support Costs

Having a voice assistant means having a customer service team available 24/7. A voice assistant provides automated customer support to your users at a lower cost. It can take care of most of their queries, delivering faster responses and speeding up resolution times.

This is why 93% of consumers are satisfied with the services provided by their voice assistants. Further, around 50% of consumers feel organized, 45% feel informed, and 37% feel happy with the help of these voice assistants. 

Conclusion

Voice assistants are a necessity to stay ahead of the competition and deliver a best-in-class user experience. A voice assistant not only helps you bring down costs, but also enhances customer satisfaction and improves the performance of your customer support team.

Go ahead, and add a voice assistant to your app. How, you ask? Alan AI is here to help. Alan is a conversational voice AI platform that simplifies the entire process of adding a voice assistant to your application. Contact us to learn more about our services and how we could help you realize your goals. 

Alan Studio Walkthrough: Part 1
https://alan.app/blog/alan-studio-walk-through-blog-1/
Fri, 08 Nov 2019

Part 1

This is the first post in a three-part series on how to get started with the Alan Platform.

If you would like to follow along with this tutorial yourself, all the files necessary will be available on our GitHub, and you can also follow along using this video tutorial!

To begin, visit https://studio.alan.app/register to create your Alan Studio account. Once you create your account and verify your email, it will direct you to the main project page, so let’s take a look!

Project Page

Once you login, Alan will direct you to: https://studio.alan.app/projects

From here, there are many important things to note.

  • Tutorial: In the menu bar up top, you will see a button labeled “Tutorial”. This takes you to https://alan.app/blog/docs/intro.html, where you can start with our documentation and learn how to integrate your script on any platform.
  • Create New Project: Click this button to start a new project quickly and easily.
  • Billing: At the top right of the menu bar, you will also see your monthly charge as well as how many free interactions you have left.
  • Menu Dropdown: This dropdown has quick shortcuts to our documentation, billing, and settings pages.
  • Current Projects: The majority of this page is taken up by cards that display your current projects along with quick analytics.

Creating our first project

Now that we are familiar with the project page, let’s create our first sample project!

Go ahead and click “Create New Project”. For this tutorial, we are going to name our project “Food Ordering”.

Scripting UI

The Scripting page is the main page where you will do all of your scripting and project work. It is divided into five main sections:

  • The menu bar at the top
  • Our Scripts Navigation pane on the left
  • Our Script Development Window in the middle
  • The Debugging Pane on the right
  • The Logs bar featuring all input/output phrases and unrecognized phrases.

Script Basics

For this tutorial, we are going to focus on creating a fully voice-enabled food ordering application. You will notice that the Script Development window is prompting us to create a new script, so let’s go ahead and add one now.

Click the “Create New Script” button and we will add a predefined script template called “Food_Ordering”.

Quick Tip: Go through our predefined scripts to learn more about the features of Alan and generate new script ideas!

Once you add your new script, you will see it open in our main window. The Source Code for this application is also available in our GitHub so you can download and follow along.

Let’s try out this script by clicking on the Alan Button and saying, “Order two pepperoni pizzas”. 

From here, we can see how Alan associates our keywords with:

An intent on line 296:

intent(`(add|I want|order|get|and|) $(NUMBER) $(ITEM ${ITEMS_INTENT})`,

And a response on line 351:

p.play(answer);

A sample with more details on the Answer function is found on line 320.

If you look in the debugging chat, you can see the actual instructions being sent to the application to execute commands.
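Conceptually, the intent-and-response pair above boils down to pattern matching plus a handler. The sketch below is a simplified stand-in written in plain JavaScript, not the real Alan SDK: the `intent()` helper here just registers a regular expression and a callback, which is roughly what the Food_Ordering script’s `intent`/`p.play` pair does under the hood.

```javascript
// Simplified stand-in for Alan's intent() helper: it registers a
// pattern and a handler, then tries each pattern against an utterance.
const handlers = [];

function intent(pattern, handler) {
  handlers.push({ pattern, handler });
}

function handleUtterance(text) {
  for (const { pattern, handler } of handlers) {
    const match = text.match(pattern);
    if (match) {
      // p.play in Alan speaks the response; here we just capture it.
      let response;
      handler({ play: (answer) => { response = answer; } }, match);
      return response;
    }
  }
  return "Sorry, I didn't catch that.";
}

// An intent roughly analogous to the pizza-ordering one above.
intent(/(?:add|i want|order|get) (\d+|two|three) (pepperoni|cheese) pizzas?/i,
  (p, match) => p.play(`Adding ${match[1]} ${match[2]} pizzas to your order`));

console.log(handleUtterance('Order two pepperoni pizzas'));
// → "Adding two pepperoni pizzas to your order"
```

The real platform does far more (fuzzy matching, slots like `$(NUMBER)`, context), but this is the basic shape of mapping a spoken phrase to a response.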

Now that we have created our first project and understand the basics of Voice Scripts, we’ll give you some time to play around with your project and adjust the scripts as you wish. We’ll see you in the next blog post where we will discuss more about customizing scripts, version control, development stages, and logs.

What is a voice assistant?
https://alan.app/blog/voiceassistant-2/
Fri, 25 Oct 2019

A voice assistant is a digital assistant that uses voice recognition, language processing algorithms, and voice synthesis to listen to specific voice commands and return relevant information or perform specific functions as requested by the user.

Based on specific commands, sometimes called intents, spoken by the user, voice assistants can return relevant information by listening for specific keywords and filtering out the ambient noise.

While voice assistants can be completely software based and able to integrate into most devices, some assistants are designed specifically for single device applications, such as the Amazon Alexa Wall Clock. 

Today, voice assistants are integrated into many of the devices we use daily, such as cell phones, computers, and smart speakers. Because of this wide array of integrations, some voice assistants offer a very specific feature set, while others are open-ended to help with almost any situation at hand.

History of voice assistants

Voice assistants have a long history that actually goes back nearly 100 years, which might seem surprising given that apps such as Siri have only been released within the past ten years.

The very first voice-activated product, Radio Rex, was released in 1922. This toy was very simple: a toy dog would stay inside its dog house until the user exclaimed its name, “Rex”, at which point it would jump out. This was all done by an electromagnet tuned to a frequency similar to the vowel in the word “Rex”, and it predated modern computers by over 20 years.

At the 1952 World’s Fair, Bell Labs announced Audrey, the Automatic Digit Recognizer. It was not a small, simple device, however: its casing stood six feet tall just to house all the hardware required to recognize ten numbers!

IBM began its long history of voice assistants in 1962 at the World’s Fair in Seattle, when the IBM Shoebox was announced. This device could recognize the digits 0-9 and six simple commands, such as “plus” and “minus”, so it could be used as a simple calculator. Its name referred to its size, similar to an average shoebox. The device contained a microphone connected to three audio filters that matched the electrical frequencies of what was said against preassigned values for each digit.

DARPA then funded five years of speech recognition R&D starting in 1971, known as the Speech Understanding Research (SUR) program. One of the biggest innovations to come out of this was Carnegie Mellon’s Harpy, which was capable of understanding over 1,000 words.

The next decade brought amazing progress and research in the speech recognition field, taking most voice recognition devices from understanding a few hundred words to understanding thousands, and slowly making their way into consumers’ homes.

Then, in 1990, Dragon Dictate was introduced to consumers’ homes for the shocking price of $9,000! It was the first consumer-oriented speech recognition program designed for home PCs. The user could dictate to the computer one word at a time, pausing between words while the computer processed each one. Seven years later, Dragon NaturallySpeaking was released; it brought more natural conversation, understanding continuous speech at up to 100 words per minute, and carried a much lower price tag of $695.

In 1994, IBM’s Simon became the first smart voice assistant. Simon was a PDA and, arguably, the first smartphone in history, predating the first Android phones by nearly 15 years!

In 2008, when Android was first released, Google slowly started rolling out voice search for its Google mobile apps on various platforms, with a dedicated Google Voice Search application released in 2011. This led to more and more advanced features, eventually resulting in Google Now and Google Assistant.

Siri followed in 2010. Developed by SRI International with speech recognition provided by Nuance Communications, the original app was released on the iOS App Store in 2010 and was acquired by Apple two months later. Then, with the release of the iPhone 4S, Siri was officially released as an integrated voice assistant within iOS. Since then, Siri has made its way to every Apple device available and has linked all these devices together in a single ecosystem.

Shortly after Siri was first developed, IBM Watson was announced publicly in 2011. Watson was named after the founder of IBM, Thomas J. Watson, and was originally conceived in 2006 to beat humans at the quiz show Jeopardy!. Now, Watson is one of the most intelligent, naturally speaking computer systems available.

Amazon Alexa was announced in 2014, its name inspired by the Library of Alexandria and by the hard consonant “X”, which helps with more accurate voice recognition. Alongside Alexa, the Echo line of smart speakers was announced, offering consumers an inexpensive route to bring smart integration into their homes.

Alan was publicly announced in 2017 to take the enterprise application world by storm. First born as “Synqq”, Alan was created by the minds behind “Qik”, the very first video messaging and conferencing mobile app. Alan is the first voice AI platform aimed at enterprise applications; while it can be found in many consumer applications, it is designed so enterprises can develop and integrate quickly and efficiently!

At the bottom of the post we’ve included a Timeline to summarize the history of voice assistants!

Technology behind Voice Assistants

Voice assistants use artificial intelligence and voice recognition to accurately and efficiently deliver the result the user is looking for. While it may seem simple to ask a computer to set a timer, the technology behind it is fascinating.

Voice Recognition

Voice recognition works by taking the analog signal of a user’s voice and turning it into a digital signal. The computer then attempts to match the digital signal to known words and phrases to recognize the user’s intent. To do this, it needs a database of existing words and syllables in a given language to match the digital signal against. Checking the input signal against this database is known as pattern recognition, and it is the primary force behind voice recognition.
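As a toy illustration of that pattern-matching step, the sketch below (a deliberate simplification, not taken from any real recognizer) matches a digitized “signal”, reduced here to a short array of numbers, against a small template database by picking the closest entry:

```javascript
// Toy pattern recognition: compare a digitized "signal" against a
// database of known templates and return the closest match. Real
// recognizers work on acoustic features and probabilistic models,
// not raw three-element arrays.
const templates = {
  yes: [0.9, 0.1, 0.8],
  no:  [0.2, 0.7, 0.1],
};

function distance(a, b) {
  // Euclidean distance between two equal-length feature vectors.
  return Math.sqrt(a.reduce((sum, x, i) => sum + (x - b[i]) ** 2, 0));
}

function recognize(signal) {
  let best = null;
  let bestDist = Infinity;
  for (const [word, template] of Object.entries(templates)) {
    const d = distance(signal, template);
    if (d < bestDist) { bestDist = d; best = word; }
  }
  return best;
}

console.log(recognize([0.85, 0.15, 0.75])); // → "yes"
```

The same closest-match idea scales up: a production system compares incoming audio features against models of thousands of words rather than two hand-written vectors.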

Artificial Intelligence

Artificial intelligence is using machines to simulate and replicate human intelligence.

In 1950, Alan Turing (the namesake of our company) published his paper “Computing Machinery and Intelligence”, which first asked the question: can machines think? Turing went on to develop the Turing Test, a method of evaluating whether a computer is capable of thinking like a human. Four approaches that define AI were later developed: thinking humanly, thinking rationally, acting humanly, and acting rationally. While the first two deal with reasoning, the latter two deal with actual behavior. Modern AI is typically seen as a computer system designed to accomplish tasks that normally require human intelligence. These systems can improve themselves using a process known as machine learning.

Machine Learning

Machine learning refers to the subset of artificial intelligence where programs are created without human coders manually writing every rule. Instead of writing out the complete program, programmers give the AI “patterns” to recognize and learn from, along with large amounts of data to sift through and study. Rather than following specific rules, the AI searches for patterns within this data and uses them to improve its existing functions. One way machine learning helps voice AI is by feeding the algorithm hours of speech from various accents and dialects.

While a traditional program requires an input and rules to produce an output, machine learning tools are given inputs and outputs and use them to create the program itself. There are two approaches to machine learning: supervised learning and unsupervised learning. In supervised learning, the model is given data that is at least partly labeled, meaning some of it is already tagged with the correct answer. This helps guide the model in categorizing the rest of the data and developing a correct algorithm. In unsupervised learning, none of the data is labeled, so it is up to the model to find the patterns on its own. This is very useful because it allows the model to find patterns the creators might never have found themselves, though the results are much less predictable.
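To make the supervised case concrete, here is a minimal, hypothetical sketch: it “trains” on a handful of labeled one-dimensional examples by placing a decision boundary halfway between the class means, then classifies new inputs. Real systems learn far richer models, but the labeled-data-in, program-out shape is the same.

```javascript
// Minimal supervised-learning sketch: learn a decision threshold
// from labeled examples, then use it to classify new inputs.
const labeled = [
  { x: 1.0, label: 'short' }, { x: 1.4, label: 'short' },
  { x: 3.1, label: 'long'  }, { x: 3.5, label: 'long'  },
];

function trainThreshold(examples) {
  const shorts = examples.filter(e => e.label === 'short').map(e => e.x);
  const longs  = examples.filter(e => e.label === 'long').map(e => e.x);
  const mean = xs => xs.reduce((a, b) => a + b, 0) / xs.length;
  // Place the boundary halfway between the two class means.
  return (mean(shorts) + mean(longs)) / 2;
}

const threshold = trainThreshold(labeled); // 2.25 for the data above
const classify = x => (x < threshold ? 'short' : 'long');

console.log(classify(1.2)); // → "short"
console.log(classify(3.0)); // → "long"
```

Note that the programmer never wrote the rule “below 2.25 means short”; the labeled data produced it, which is exactly the input-plus-output-creates-the-program idea described above.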

Different Voice Assistant approaches

Many conversational assistants today combine a task-oriented and a knowledge-oriented workflow to carry out almost any task a user can throw at them. A task-oriented workflow might include filling out a form, while a knowledge-oriented workflow includes answering what the capital of a state is or giving the technical specifications of a product.

Task-oriented approach

A task-oriented approach uses goal-directed tasks to achieve what the user needs, and often integrates with other apps to complete them. For example, if you ask a voice assistant to set an alarm for 3PM, it understands this as a task request and communicates with your default clock application to set an alarm for 3PM. It then checks whether anything else is needed, such as a name for the alarm, and communicates that back to you. This approach does not require an extensive online database, as it mainly uses the knowledge and existing skills of other installed applications.
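The alarm example can be sketched as a simple task router. Everything here is hypothetical: the `clockApp` object and its `setAlarm` handler stand in for the platform’s real clock integration.

```javascript
// Hypothetical sketch of a task-oriented flow: parse an utterance,
// identify the task, and hand it off to the relevant app's handler.
const clockApp = {
  setAlarm(time) {
    // A real assistant would call the platform's clock API here.
    return `Alarm set for ${time}`;
  },
};

function handleTask(utterance) {
  const match = utterance.match(/set an alarm for (\d{1,2}\s?(?:AM|PM))/i);
  if (match) {
    return clockApp.setAlarm(match[1].toUpperCase());
  }
  return 'Sorry, I cannot do that yet.';
}

console.log(handleTask('set an alarm for 3PM')); // → "Alarm set for 3PM"
```

The key point is the delegation: the assistant itself only recognizes the request and routes it; the clock application owns the actual task.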

Knowledge-oriented approach

A knowledge-oriented approach uses analytical data to help users with their tasks, focusing on online databases and recorded knowledge rather than on other apps. For example, any time a user asks for an internet search, the assistant uses the available online databases to return relevant results and recommend the top one. Answering a trivia question is knowledge-oriented, since the assistant is searching for data rather than working with other apps to complete a task.

Benefits of Voice Assistants

Some examples of what a Voice Assistant can do include:

  • Check the weather
  • Turn on/off connected smart devices
  • Search databases

One of the main reasons for the growing popularity of Voice User Interfaces (VUIs) is the growing complexity of mobile software without a matching increase in screen size, which puts a GUI (Graphical User Interface) at a huge disadvantage. As new iterations of phones come out, screen sizes stay relatively the same, leading to very cramped interfaces and frustrating user experiences. This is why more and more developers are turning to Voice User Interfaces.

Efficiency and Safety

While typing has become much faster as people have gotten used to standard keyboards, speaking is typically quicker, more natural, and less prone to spelling errors. This makes for a much more efficient and natural workflow.

Quick learning curve

One of the greatest benefits of voice assistants is the quick learning curve. Instead of learning to use specific physical devices like mice and touch screens, you can rely on your natural conversational tendencies and just use your voice.

Wider Device Integration

Since a screen or keyboard isn’t necessary, it’s easy to build voice integration into a much wider array of devices. In the future, smart glasses, furniture, and appliances will all come with voice assistants already integrated.

Why and When to use Voice Assistants

There are many use cases for a voice assistant in today’s world: for example, when your hands are full and you are unable to use a touch screen or keyboard, or when you are driving. Say you are driving and need to change your music: you can simply ask the voice assistant to “play my driving playlist”. This makes for a safer driving experience and helps avoid the risk of distracted driving.

User Interfaces

To further understand voice assistants, it is important to look at the overall user experience: what a user interface is, and how a VUI differs from the more traditional graphical user interfaces that modern apps currently use.

Graphical User Interface (GUI)

A graphical user interface is what is most commonly used today. For example, the internet browser you’re using to read this article is a graphical user interface. Using graphical icons and visual indicators, the user can interact with machines more quickly and easily than before.

A graphical user interface can also be used in something like a chatbot, where the user communicates with the device over text and the machine responds with natural conversational text. The big downside is that, since it is all done in text, it can feel cumbersome and inefficient, and can take longer than voice in certain situations.

Voice User Interface (VUI)

An example of a VUI is something like Siri, where there is an audio cue that the device is listening, followed by a verbal response.

Most apps today combine a sense of both Graphical and Voice User Interfaces. For example, when using a maps application, you can use voice to search for destinations and the application will show you the most relevant results, placing the most important information at the top of the screen.

Some examples of popular smart assistants today are Alan, Amazon Alexa, Siri by Apple, and Google Voice Assistant.

Popular Voice Assistants

Voice Assistant adoption by platform, from Voicebot.ai

Siri

Siri is the most popular voice assistant today. Created in 2010 by SRI International and purchased by Apple that same year, Siri has quickly become an integral part of the Apple ecosystem, bringing all Apple devices and applications together to work in tandem with one another.

Alexa

Created by Amazon in 2014, Alexa was named for its similarity to the Library of Alexandria, and it was originally inspired by the conversational voice system found on board the U.S.S. Enterprise in Star Trek. Alexa was released alongside the Amazon Echo, a smart speaker that introduced consumers to home automation. The Echo uses the Alexa platform to let users interact with the Amazon ecosystem and connect a plethora of smart devices.

Google Assistant

Originally unveiled in 2016, Google Assistant is the spiritual successor of Google Now, the main improvement being the addition of two-way conversations. Where Google Now would return answers in the form of a Google search results page, Google Assistant answers in natural sentences and returns recommendations in the form of feature cards.

Cortana

Beginning development in 2009, Microsoft’s Cortana has had one of the longest-running visions of giving people access to voice assistants in their daily lives. Microsoft began shipping Cortana with all Windows 10 and Xbox devices, leading to a huge increase in registered Cortana users; in 2018, it was reported that Cortana had over 800 million users.

Alan

In 2017 Alan set out to take voice assistants to the next level, by enabling voice AI for all applications. Using domain specific language models and contextual understanding, Alan is focused on creating a new generation of Enterprise Voice AI applications. By using the Alan Platform, developers are able to take control of voice, and create an effective workflow that best fits their users with the help of vocal commands.

Future of Voice Assistants

As AI becomes more advanced and voice technology more widely accepted, voice-controlled digital assistants will not only become more natural, they will also be integrated into more everyday devices. Conversations will become much more natural, emulating human conversation, which will introduce more complex task flows. More and more people are using voice assistants, too: in early 2019, it was estimated that 111.8 million people in the US would use a voice assistant at least monthly, up 9.5% from the previous year.

Further Integration

In the future, devices will be more integrated with voice, and it will become easier and easier to search by voice. For example, Amazon has already released a wall clock with Amazon Alexa built in, so you can ask it to set a timer or tell you the time. While devices like these aren’t full-blown voice-activated personal assistants, they show a lot of promise for the coming years. Using vocal commands, we will be able to work with our devices just by talking.

Natural Conversations

Currently, as users are still getting used to communicating with their digital devices by voice, conversations can seem broken and awkward. But as digital processing becomes quicker and people grow more accustomed to voice assistants in their everyday devices, users won’t have to pause and wait for the assistant to catch up; instead, we will have natural conversations with our voice assistants, creating a smoother and more natural experience.

More complex task flows

As conversations with voice assistants become more natural and voice recognition and processing become quicker, it won't be uncommon to see users adopt more advanced tasks in their daily routines. For example, instead of asking a voice assistant how long a commute is and then asking about different options, you might simply say, "If Uber is quicker than taking the bus, reserve an Uber ride from home to work and tell me how long it will take."

How to make your own voice assistant

As the number of publicly available voice assistants grows, tools are appearing that make it as easy as possible to build a voice assistant that fits your needs.

For example, if you just want to add a specific skill or command, it may be more efficient to integrate it into an existing voice assistant, such as Alexa.

Amazon has made it remarkably simple to add your own command to the rapidly growing set of publicly available Alexa Skills. Log in to AWS with the same account your Echo is linked to and use the tools there to create a free Alexa Skill.
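A custom skill's backend is typically a web service or AWS Lambda function that receives a JSON request and returns a JSON response. Here is a minimal sketch of such a handler, written without the usual ask-sdk helper library; the "OrderStatusIntent" name and the spoken responses are hypothetical examples, not part of any real skill.

```javascript
// Minimal Alexa custom-skill backend as a plain Lambda-style handler.
// The request/response envelope shapes follow the Alexa Skills Kit
// JSON interface; intent name and texts are illustrative only.
const handler = async (event) => {
  const request = event.request || {};
  let text = 'Sorry, I did not understand that.';
  if (request.type === 'LaunchRequest') {
    // User opened the skill without asking anything specific
    text = 'Welcome! Ask me about your order.';
  } else if (request.type === 'IntentRequest' &&
             request.intent && request.intent.name === 'OrderStatusIntent') {
    // User matched our (hypothetical) order-status intent
    text = 'Your order is on its way.';
  }
  return {
    version: '1.0',
    response: {
      outputSpeech: { type: 'PlainText', text },
      shouldEndSession: true,
    },
  };
};
exports.handler = handler;
```

In a real skill you would define the intents and sample utterances in the Alexa developer console and point the skill at this endpoint.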

Using Alan Studio, a completely browser-based voice AI IDE, you can develop, test, and deploy voice integrations straight from your browser.
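For a flavor of what an Alan Studio script looks like, here is a minimal sketch. The `intent()` function and the `p` handler object are provided by the Alan Studio runtime, so this fragment runs inside Alan Studio rather than as a standalone program, and the phrasing is just an example:

```javascript
// Alan Studio voice script (JavaScript, hosted in the Alan Studio runtime).
// intent() registers a voice pattern; p.play() speaks the response.
intent('What can you do?', p => {
    p.play('I can help you control this application by voice.');
});
```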

Why Alan?

Alan is a highly customizable voice AI platform designed to work with any pre-existing application. Built with enterprise use in mind, it treats security and business functionality as top priorities. You can leverage visual and voice context to support any workflow and improve efficiency today, and because Alan Studio is completely browser-based, you can edit your scripts on the go whenever the need arises. Gone are the days of maintaining a separate script version for each platform: with Alan, a single script version can be embedded into any app, whether iOS, Android, or web. Sign up for Alan Studio today and see how you can create an AI voice assistant solution of your own.

The Alan Voice AI Platform
Click the Alan button to learn more!

Voice Assistant Timeline

  • 1922 – The first voice-activated consumer product, "Radio Rex", hits store shelves
  • 1952 – Audrey, the Automatic Digit Recognition Machine, is announced
  • 1962 – The IBM Shoebox is shown for the first time at the Seattle World's Fair
  • 1971 – DARPA funds five years of speech recognition research and development
  • 1976 – Harpy is shown at Carnegie Mellon
  • 1984 – IBM releases "Tangora", the first voice-activated typewriter
  • 1990 – Dragon Dictate is released
  • 1994 – Simon by IBM is the first modern voice assistant released
  • 2010 – Siri is released as an app on the iOS app store
  • 2011 – IBM Watson is released
  • 2012 – Google Now is released
  • 2014 – Amazon Alexa and Echo are released
  • 2015 – Microsoft Cortana is released
  • 2017 – Alan is developed and released with the Alan Platform
From Voicebot.ai

Resources

https://whatis.techtarget.com/definition/voice-assistant

https://www.smartsheet.com/voice-assistants-artificial-intelligence

https://www.ibm.com/ibm/history/ibm100/us/en/icons/speechreco

http://www.bbc.com/future/story/20170214-the-machines-that-learned-to-listen

https://towardsdatascience.com/build-your-first-voice-assistant-85a5a49f6cc1

This article was reposted at dev.to here:
https://dev.to/alanvoiceai/what-is-a-voice-assistant-492p

Add voice to your app in 10 minutes!
https://alan.app/blog/add-voice-to-your-app-in-10-minutes/
Tue, 17 Sep 2019 21:00:00 +0000
Thursday, September 19th at 10 AM PST

Have you ever wanted to learn more about adding voice to your app but weren’t sure who to ask? Now is the perfect time!

Join us on Thursday, September 19th at 10 AM PST as we walk you through Alan and show you how you can add voice to your app in 10 minutes! Chat with Alan developers and ask questions live!

Sign up today at:

https://zoom.us/webinar/register/6115686541869/WN_mnocfxtfTRWc9d7y16z7Rg

Podcast: This Week in Voice
https://alan.app/blog/podcast-twin/
Tue, 17 Sep 2019 16:29:19 +0000

We are incredibly excited to share a brand-new episode of This Week in Voice, in which our CEO, Ramu Sunkara, sits down with Bradley Metrock to discuss the latest in voice. Thank you to Bradley for having us! Check out the episode below:

https://www.thisweekinvoice.com/s4e3-sep-12-2019

The topics discussed in this episode include:

From their website: This Week In Voice is VoiceFirst.FM's weekly news podcast, bringing you the most interesting, relevant stories in the rapidly growing world of voice technology. If you like what you hear, you can check out more episodes on Apple Podcasts, Google Play Music, Stitcher Radio, SoundCloud, TuneIn, and many other preferred podcast providers.

The host, Bradley Metrock, is the CEO of Score Publishing, as well as the Executive Producer of Project Voice, the #1 event for voice tech and AI in America, coming the week after CES. Check out projectvoice.ai for more information about the event.

Voice In Their Apps
https://alan.app/blog/voice-in-their-apps/
Thu, 18 Jul 2019 21:43:41 +0000

Voice is still in its infancy, but it is quickly becoming the way we interact with our devices every day. The majority of voice usage comes from the big platforms' own applications and devices: Siri, Google Assistant, and the Amazon Echo. More and more consumers have adopted voice, which now runs on over 1B devices and is the fastest-growing interface ever. Voice is entering the mainstream; however, a few flaws prevent it from becoming a ubiquitous interface:

  • Companies must rebuild and recreate their existing functionality and branding for voice, requiring heavy investment in their chosen assistant, be it Alexa, Siri, or Google.
  • User context is lost after a few commands, as thousands of skills compete for the same broad set of user commands.
  • Many multi-step voice conversations on these devices aren't purely voice-enabled: with several Siri and Google Assistant skills, users have to fall back to touch and typing.
  • Privacy and security are lacking: employees at the big platforms listen to and expose user data to gain a competitive edge in machine learning, and companies don't own the IP of what they create.

How developers will monetize their voice apps is also an open question. Currently, the big platforms reward app developers based on monthly user engagement while they build the market and search for a path to long-term monetization.

Enterprises are creating voice apps on these platforms with limited control over the experience, the data, and the technical investment. Despite this, voice is an important part of the future, and these difficulties for enterprises will be eliminated. The future of voice is coming.

Add Visual Voice Experiences to your SAP Mobile Applications
https://alan.app/blog/add-a-visual-voice-experience-to-your-sap-mobile-applications-with-alan/
Fri, 17 May 2019 18:05:05 +0000

Recently, we created a full visual voice experience for the SAP sample application provided through the SAP Cloud Platform SDK for iOS, using the new integration from the Alan Voice AI platform. Here, we'll go over the steps we took to create this visual voice experience. You can find the full source code of this application and the supporting Alan Visual Voice scripts here.

1. Download and Install the SAP Cloud Platform SDK for iOS

Head over to SAP's developer page and click on "SAP Cloud Platform SDK for iOS". Click the top link there to download the SAP Cloud Platform SDK for iOS, add it to your Applications folder, and open the application.

2. Create the SAP Sample Application

Now, open the SAP Cloud Platform SDK on your computer, click “Create new”, then click “Sample Application”. Then follow the steps to add your SAP account, Application details, and the name of your Xcode project. This will create an Xcode project with the Sample application.

Once this is done, open the Xcode project and take a look around. Build the project, and you'll see it's an application with suppliers, categories, products, and ordering information.

Now let’s integrate with Alan.

3. Integrate the application with Alan Platform

Go to Alan Studio at https://studio.alan.app. If you don’t have an account, create one to get started.

Once you log in, create a project named "SAP". For now, we're just integrating the SAP sample application with Alan; we'll create the voice experience later.

At the top of the screen, switch from "Development" to "Production". Now open the "Embed Code </>" menu, click on the "iOS" tab, and review the steps.

Then download the iOS SDK framework. Once the download finishes, go back to your Xcode project and create a new group named "Alan". Drag and drop the iOS SDK framework into this group.

Next, go to the “Embedded Binaries” section and add the SDK Framework. Make sure that you also have the framework in your project’s “Linked Frameworks and Libraries” section as well.

Now we need to show a message asking for microphone access. To do this, open the Info.plist file. In the "Key" column, right-click and select "Add Row". For the name of the key, enter "NSMicrophoneUsageDescription". The value is the message your users will see when they press the Alan button for the first time in the application. Use a message like "Alan needs microphone access to provide the voice experience for this application".

Go back to the group you created earlier named "Alan", right-click it, and select "New File". Select "Swift" as the file type and name it "WindowUI+Alan". All of the Alan button's functions will be stored in this file, including the size, color styles, and voice states. You can find the code for this file here:

[code language="objc" collapse="true" title="WindowUI+Alan.swift"]
//
//  UIWindow+Alan.swift
//  MyDeliveries
//
//  Created by Sergey Yuryev on 22/04/2019.
//  Copyright © 2019 SAP. All rights reserved.
//

import UIKit
import AlanSDK

public final class ObjectAssociation<T: Any> {
    
    private let policy: objc_AssociationPolicy
    
    public init(policy: objc_AssociationPolicy = .OBJC_ASSOCIATION_RETAIN_NONATOMIC) {
        self.policy = policy
    }
    
    public subscript(index: AnyObject) -> T? {
        get { return objc_getAssociatedObject(index, Unmanaged.passUnretained(self).toOpaque()) as! T? }
        set { objc_setAssociatedObject(index, Unmanaged.passUnretained(self).toOpaque(), newValue, policy) }
    }
    
}

extension UIWindow {
    
    private static let associationAlanButton = ObjectAssociation<AlanButton>()
    private static let associationAlanText = ObjectAssociation<AlanText>()
    
    var alanButton: AlanButton? {
        get {
            return UIWindow.associationAlanButton[self]
        }
        set {
            UIWindow.associationAlanButton[self] = newValue
        }
    }
    
    var alanText: AlanText? {
        get {
            return UIWindow.associationAlanText[self]
        }
        set {
            UIWindow.associationAlanText[self] = newValue
        }
    }
    
    func moveAlanToFront() {
        if let button = self.alanButton {
            self.bringSubviewToFront(button)
        }
        if let text = self.alanText {
            self.bringSubviewToFront(text)
        }
    }
    
    func addAlan() {
        let buttonSpace: CGFloat = 20
        let buttonWidth: CGFloat = 64
        let buttonHeight: CGFloat = 64
        let textWidth: CGFloat = self.frame.maxX - buttonWidth - buttonSpace * 3
        let textHeight: CGFloat = 64
        
        let config = AlanConfig(key: "", isButtonDraggable: false)

        self.alanButton = AlanButton(config: config)
        if let button = self.alanButton {
            let safeHeight = self.frame.maxY - self.safeAreaLayoutGuide.layoutFrame.maxY
            let realX = self.frame.maxX - buttonWidth - buttonSpace
            let realY = self.frame.maxY - safeHeight - buttonHeight - buttonSpace
            
            button.frame = CGRect(x: realX, y: realY, width: buttonWidth, height: buttonHeight)
            self.addSubview(button)
            self.bringSubviewToFront(button)
        }
        
        self.alanText = AlanText(frame: CGRect.zero)
        if let text = self.alanText {
            let safeHeight = self.frame.maxY - self.safeAreaLayoutGuide.layoutFrame.maxY
            let realX = self.frame.minX + buttonSpace
            let realY = self.frame.maxY - safeHeight - textHeight - buttonSpace
            
            text.frame = CGRect(x: realX, y: realY, width: textWidth, height: textHeight)
            self.addSubview(text)
            self.bringSubviewToFront(text)
            
            text.layer.shadowColor = UIColor.black.cgColor
            text.layer.shadowOffset = CGSize(width: 0, height: 0)
            text.layer.shadowOpacity = 0.3
            text.layer.shadowRadius = 4.0
            
            for subview in text.subviews {
                if let s = subview as? UILabel {
                    s.backgroundColor = UIColor.white
                }
            }
        }
    }
}
[/code]

The next thing to do is to open the project's "ApplicationUIManager.swift" file and add a few methods required to use the voice button in the application. Here are the sections each method should be added to:

[code language="objc" collapse="true" title="ApplicationUIManager.swift" highlight="28,92"]
//
// AlanDeliveries
//
// Created by SAP Cloud Platform SDK for iOS Assistant application on 24/04/19
//

import SAPCommon
import SAPFiori
import SAPFioriFlows
import SAPFoundation

class SnapshotViewController: UIViewController {}

class ApplicationUIManager: ApplicationUIManaging {
    // MARK: – Properties

    let window: UIWindow

    /// Save ViewController while splash/onboarding screens are presented
    private var _savedApplicationRootViewController: UIViewController?
    private var _onboardingSplashViewController: (UIViewController & InfoTextSettable)?
    private var _coveringViewController: UIViewController?

    // MARK: – Init

    public init(window: UIWindow) {
        self.window = window
        self.window.addAlan()
    }

    // MARK: - ApplicationUIManaging

    func hideApplicationScreen(completionHandler: @escaping (Error?) -> Void) {
        // Check whether the covering screen is already presented or not
        guard self._coveringViewController == nil else {
            completionHandler(nil)
            return
        }

        self.saveApplicationScreenIfNecessary()
        self._coveringViewController = SnapshotViewController()
        self.window.rootViewController = self._coveringViewController

        completionHandler(nil)
    }

    func showSplashScreenForOnboarding(completionHandler: @escaping (Error?) -> Void) {
        // splash already presented
        guard self._onboardingSplashViewController == nil else {
            completionHandler(nil)
            return
        }

        setupSplashScreen()

        completionHandler(nil)
    }

    func showSplashScreenForUnlock(completionHandler: @escaping (Error?) -> Void) {
        guard self._onboardingSplashViewController == nil else {
            completionHandler(nil)
            return
        }

        self.saveApplicationScreenIfNecessary()

        setupSplashScreen()

        completionHandler(nil)
    }

    func showApplicationScreen(completionHandler: @escaping (Error?) -> Void) {
        // Check if an application screen has already been presented
        guard self.isSplashPresented else {
            completionHandler(nil)
            return
        }

        // Restore the saved application screen or create a new one
        let appViewController: UIViewController
        if let savedViewController = self._savedApplicationRootViewController {
            appViewController = savedViewController
        } else {
            let appDelegate = (UIApplication.shared.delegate as! AppDelegate)
            let splitViewController = UIStoryboard(name: "Main", bundle: Bundle.main).instantiateViewController(withIdentifier: "MainSplitViewController") as! UISplitViewController
            splitViewController.delegate = appDelegate
            splitViewController.modalPresentationStyle = .currentContext
            splitViewController.preferredDisplayMode = .allVisible
            appViewController = splitViewController
        }
        self.window.rootViewController = appViewController
        self.window.moveAlanToFront()
        self._onboardingSplashViewController = nil
        self._savedApplicationRootViewController = nil
        self._coveringViewController = nil

        completionHandler(nil)
    }

    func releaseRootFromMemory() {
        self._savedApplicationRootViewController = nil
    }

    // MARK: – Helpers

    private var isSplashPresented: Bool {
        return self.window.rootViewController is FUIInfoViewController || self.window.rootViewController is SnapshotViewController
    }

    /// Helper method to capture the real application screen.
    private func saveApplicationScreenIfNecessary() {
        if self._savedApplicationRootViewController == nil, !self.isSplashPresented {
            self._savedApplicationRootViewController = self.window.rootViewController
        }
    }

    private func setupSplashScreen() {
        self._onboardingSplashViewController = FUIInfoViewController.createSplashScreenInstanceFromStoryboard()
        self.window.rootViewController = self._onboardingSplashViewController

        // Set the splash screen for the specific presenter
        let modalPresenter = OnboardingFlowProvider.modalUIViewControllerPresenter
        modalPresenter.setSplashScreen(self._onboardingSplashViewController!)
        modalPresenter.animated = true
    }
}
[/code]

For the final step of the integration, return to your project in Alan Studio, open the "Embed Code </>" menu and the "iOS" tab, and copy the "Alan SDK Key". Make sure you copy the "Production" key for this step!

Now go back to your Xcode project's "WindowUI+Alan.swift" file and paste the key between the quotes in the line let config = AlanConfig(key: "", isButtonDraggable: false)

It's time to build the application and see how it looks. Press the Play button in the upper left of Xcode.

You should see the Alan button in the bottom right of the application. Now it's time to create the full visual voice experience.

4. Create the Visual Voice experience in Alan

The visual voice experience for this application will let users ask about products, orders, and suppliers. We've already created the scripts for this, which you can find here. Copy and paste these scripts into your project within Alan and save. Then create a new version with this script and set it to "Production".
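To give a sense of what the scripts do, here is a small sketch of an Alan Studio script fragment that emits commands to the app. In Alan scripts, calling `p.play()` with an object sends a command to the connected application; the command names match the ones our app's handlers parse, while the voice phrasing here is illustrative. This fragment runs inside the Alan Studio runtime, which provides `intent()` and `p`:

```javascript
// Alan Studio script fragment (hosted in the Alan Studio runtime).
// p.play(object) sends a command to the app; p.play(string) speaks it.
intent('Show me the laptops', p => {
    p.play({ command: 'showProductCategory', value: 'Laptops' });
    p.play('Here are the laptops.');
});

intent('Go back', p => {
    p.play({ command: 'goBack' });
});
```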

With that done, we need to add handlers in the application so that voice commands can control it. Note that the handlers for your application will be slightly different. Here are examples of our handlers:

[code language="objc" collapse="true" title="WindowUI+Alan.swift" highlight="27-36,40,41,45-61,90-113,157,160-228"]
//
//  UIWindow+Alan.swift
//  MyDeliveries
//
//  Created by Sergey Yuryev on 22/04/2019.
//  Copyright © 2019 SAP. All rights reserved.
//

import UIKit
import AlanSDK

public final class ObjectAssociation<T: Any> {
    
    private let policy: objc_AssociationPolicy
    
    public init(policy: objc_AssociationPolicy = .OBJC_ASSOCIATION_RETAIN_NONATOMIC) {
        self.policy = policy
    }
    
    public subscript(index: AnyObject) -> T? {
        get { return objc_getAssociatedObject(index, Unmanaged.passUnretained(self).toOpaque()) as! T? }
        set { objc_setAssociatedObject(index, Unmanaged.passUnretained(self).toOpaque(), newValue, policy) }
    }
    
}

protocol ProductViewDelegate {
    func highlightProductId(_ id: String?)
    func showProductCategory(_ category: String)
    func showProductIds(_ ids: [String])
}

protocol NavigateViewDelegate {
    func navigateCategory(_ category: String)
    func navigateBack()
}

extension UIWindow {
    
    private static let navigateDelegate = ObjectAssociation<NavigateViewDelegate>()
    private static let productDelegate = ObjectAssociation<ProductViewDelegate>()
    private static let associationAlanButton = ObjectAssociation<AlanButton>()
    private static let associationAlanText = ObjectAssociation<AlanText>()
    
    var navigateViewDelegate: NavigateViewDelegate? {
        get {
            return UIWindow.navigateDelegate[self]
        }
        set {
            UIWindow.navigateDelegate[self] = newValue
        }
    }
    
    var productViewDelegate: ProductViewDelegate? {
        get {
            return UIWindow.productDelegate[self]
        }
        set {
            UIWindow.productDelegate[self] = newValue
        }
    }
    
    var alanButton: AlanButton? {
        get {
            return UIWindow.associationAlanButton[self]
        }
        set {
            UIWindow.associationAlanButton[self] = newValue
        }
    }
    
    var alanText: AlanText? {
        get {
            return UIWindow.associationAlanText[self]
        }
        set {
            UIWindow.associationAlanText[self] = newValue
        }
    }
    
    func moveAlanToFront() {
        if let button = self.alanButton {
            self.bringSubviewToFront(button)
        }
        if let text = self.alanText {
            self.bringSubviewToFront(text)
        }
    }
    
    func setVisual(_ data: [String: Any]) {
        print("setVisual: \(data)");
        if let button = self.alanButton {
            button.setVisual(data)
        }
    }
    
    func playText(_ text: String) {
        if let button = self.alanButton {
            button.playText(text)
        }
    }
    
    func playData(_ data: [String: String]) {
        if let button = self.alanButton {
            button.playData(data)
        }
    }
    
    func call(method: String, params: [String: Any], callback:@escaping ((Error?, String?) -> Void)) {
        if let button = self.alanButton {
            button.call(method, withParams: params, callback: callback)
        }
    }
    
    func addAlan() {
        let buttonSpace: CGFloat = 20
        let buttonWidth: CGFloat = 64
        let buttonHeight: CGFloat = 64
        let textWidth: CGFloat = self.frame.maxX - buttonWidth - buttonSpace * 3
        let textHeight: CGFloat = 64
        
        let config = AlanConfig(key: "", isButtonDraggable: false)

        self.alanButton = AlanButton(config: config)
        if let button = self.alanButton {
            let safeHeight = self.frame.maxY - self.safeAreaLayoutGuide.layoutFrame.maxY
            let realX = self.frame.maxX - buttonWidth - buttonSpace
            let realY = self.frame.maxY - safeHeight - buttonHeight - buttonSpace
            
            button.frame = CGRect(x: realX, y: realY, width: buttonWidth, height: buttonHeight)
            self.addSubview(button)
            self.bringSubviewToFront(button)
        }
        
        self.alanText = AlanText(frame: CGRect.zero)
        if let text = self.alanText {
            let safeHeight = self.frame.maxY - self.safeAreaLayoutGuide.layoutFrame.maxY
            let realX = self.frame.minX + buttonSpace
            let realY = self.frame.maxY - safeHeight - textHeight - buttonSpace
            
            text.frame = CGRect(x: realX, y: realY, width: textWidth, height: textHeight)
            self.addSubview(text)
            self.bringSubviewToFront(text)
            
            text.layer.shadowColor = UIColor.black.cgColor
            text.layer.shadowOffset = CGSize(width: 0, height: 0)
            text.layer.shadowOpacity = 0.3
            text.layer.shadowRadius = 4.0
            
            for subview in text.subviews {
                if let s = subview as? UILabel {
                    s.backgroundColor = UIColor.white
                }
            }
        }
        
        NotificationCenter.default.addObserver(self, selector: #selector(self.handleEvent(_:)), name:NSNotification.Name(rawValue: "kAlanSDKEventNotification"), object:nil)
    }
    
    @objc func handleEvent(_ notification: Notification) {
        guard let userInfo = notification.userInfo else {
            return
        }
        guard let event = userInfo["onEvent"] as? String else {
            return
        }
        guard event == "command" else {
            return
        }
        guard let jsonString = userInfo["jsonString"] as? String else {
            return
        }
        guard let data = jsonString.data(using: .utf8) else {
            return
        }
        guard let unwrapped = try? JSONSerialization.jsonObject(with: data, options: [])  else {
            return
        }
        guard let d = unwrapped as? [String: Any] else {
            return
        }
        guard let json = d["data"] as? [String: Any] else {
            return
        }
        guard let command = json["command"] as? String else {
            return
        }
        
        if command == "showProductCategory" {
            if let value = json["value"] as? String {
                if let d = self.productViewDelegate {
                    d.showProductCategory(value)
                }
            }
        }
        else if command == "showProductIds" {
            if let value = json["value"] as? [String] {
                if let d = self.productViewDelegate {
                    d.showProductIds(value)
                }
            }
        }
        else if command == "highlightProductId" {
            if let value = json["value"] as? String {
                if let d = self.productViewDelegate {
                    d.highlightProductId(value)
                }
            }
            else {
                if let d = self.productViewDelegate {
                    d.highlightProductId(nil)
                }
            }
        }
        else if command == "navigate" {
            if let value = json["screen"] as? String {
                if let d = self.navigateViewDelegate {
                    d.navigateCategory(value)
                }
            }
        }
        else if command == "goBack" {
            if let d = self.navigateViewDelegate {
                d.navigateBack()
            }
        }
    }
}
[/code]
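The Swift handleEvent method above unwraps a JSON payload of the shape {"data": {"command": ..., "value" or "screen": ...}}. As an illustrative, non-Alan-specific sketch, the same dispatch logic can be written in a few lines of JavaScript (the function name and handler table are our own, not part of any SDK):

```javascript
// Illustrative JavaScript counterpart of the Swift handleEvent logic:
// parse the event's JSON string, extract data.command, and dispatch to a
// handler by name. Returns true when a handler actually ran.
function dispatchAlanCommand(jsonString, handlers) {
  let parsed;
  try {
    parsed = JSON.parse(jsonString);
  } catch (e) {
    return false; // malformed JSON, like the guard statements in Swift
  }
  const data = parsed && parsed.data;
  if (!data || typeof data.command !== 'string') {
    return false;
  }
  const handler = handlers[data.command];
  if (typeof handler !== 'function') {
    return false;
  }
  // "navigate" carries its argument in "screen"; the rest use "value"
  handler(data.command === 'navigate' ? data.screen : data.value);
  return true;
}
```

Keeping this dispatch in one place means adding a new voice command is just one more entry in the handler table, mirroring the chain of if/else branches in the Swift code.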
[code language="objc" collapse="true" title="ProductMasterViewController.swift" highlight="13,25,29-31,36-39,115-165"]
//
// AlanDeliveries
//
// Created by SAP Cloud Platform SDK for iOS Assistant application on 24/04/19
//

import Foundation
import SAPCommon
import SAPFiori
import SAPFoundation
import SAPOData

class ProductMasterViewController: FUIFormTableViewController, SAPFioriLoadingIndicator, ProductViewDelegate {
    var espmContainer: ESPMContainer<OnlineODataProvider>!
    public var loadEntitiesBlock: ((_ completionHandler: @escaping ([Product]?, Error?) -> Void) -> Void)?
    private var entities: [Product] = [Product]()
    private var allEntities: [Product] = [Product]()
    private var entityImages = [Int: UIImage]()
    private let logger = Logger.shared(named: "ProductMasterViewControllerLogger")
    private let okTitle = NSLocalizedString("keyOkButtonTitle",
                                            value: "OK",
                                            comment: "XBUT: Title of OK button.")
    var loadingIndicator: FUILoadingIndicatorView?

    var highlightedId: String?
    
    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        if let window = UIApplication.shared.keyWindow {
            window.productViewDelegate = nil
        }
    }
    
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        if let window = UIApplication.shared.keyWindow {
            window.setVisual(["screen": "Product"])
            window.productViewDelegate = self
        }
    }
    
    override func viewDidLoad() {
        super.viewDidLoad()
        self.edgesForExtendedLayout = []
        // Add refreshcontrol UI
        self.refreshControl?.addTarget(self, action: #selector(self.refresh), for: UIControl.Event.valueChanged)
        self.tableView.addSubview(self.refreshControl!)
        // Cell height settings
        self.tableView.rowHeight = UITableView.automaticDimension
        self.tableView.estimatedRowHeight = 98
        self.updateTable()
    }

    var preventNavigationLoop = false
    var entitySetName: String?

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        self.clearsSelectionOnViewWillAppear = self.splitViewController!.isCollapsed
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }

    // MARK: - Table view data source

    override func tableView(_: UITableView, numberOfRowsInSection _: Int) -> Int {
        return self.entities.count
    }

    override func tableView(_: UITableView, canEditRowAt _: IndexPath) -> Bool {
        return true
    }

    override func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        let product = self.entities[indexPath.row]
        let cell = CellCreationHelper.objectCellWithNonEditableContent(tableView: tableView, indexPath: indexPath, key: "ProductId", value: "\(product.productID!)")
        cell.preserveDetailImageSpacing = true
        cell.headlineText = product.name
        cell.footnoteText = product.productID
        let backgroundView = UIView()
        backgroundView.backgroundColor = UIColor.white
        
        if let image = image(for: indexPath, product: product) {
            cell.detailImage = image
            cell.detailImageView.contentMode = .scaleAspectFit
        }
        if let hid = self.highlightedId, let current = product.productID, hid == current {
            backgroundView.backgroundColor = UIColor(red: 235 / 255, green: 245 / 255, blue: 255 / 255, alpha: 1.0)
        }
        cell.backgroundView = backgroundView
        return cell
    }

    override func tableView(_ tableView: UITableView, commit editingStyle: UITableViewCell.EditingStyle, forRowAt indexPath: IndexPath) {
        if editingStyle != .delete {
            return
        }
        let currentEntity = self.entities[indexPath.row]
        self.espmContainer.deleteEntity(currentEntity) { error in
            if let error = error {
                self.logger.error("Delete entry failed.", error: error)
                AlertHelper.displayAlert(with: NSLocalizedString("keyErrorDeletingEntryTitle", value: "Delete entry failed", comment: "XTIT: Title of deleting entry error pop up."), error: error, viewController: self)
            } else {
                self.entities.remove(at: indexPath.row)
                tableView.deleteRows(at: [indexPath], with: .fade)
            }
        }
    }

    // MARK: - Data accessing
    
    func highlightProductId(_ id: String?) {
        self.highlightedId = id
        DispatchQueue.main.async {
            self.tableView.reloadData()
            self.logger.info("Alan: Table updated successfully!")
        }
    }

    internal func showProductCategory(_ category: String) {
        if category == "All" {
            self.entityImages.removeAll()
            self.entities.removeAll()
            self.entities.append(contentsOf: self.allEntities)
        }
        else {
            let filtered = self.allEntities.filter {
                if let c = $0.category, c == category {
                    return true
                }
                return false
            }
            self.entityImages.removeAll()
            self.entities.removeAll()
            self.entities.append(contentsOf: filtered)
        }
        DispatchQueue.main.async {
            let range = NSMakeRange(0, self.tableView.numberOfSections)
            let sections = NSIndexSet(indexesIn: range)
            self.tableView.reloadSections(sections as IndexSet, with: .automatic)
            self.logger.info("Alan: Table updated successfully!")
        }
        
    }
    
    internal func showProductIds(_ ids: [String]) {
        let filtered = self.allEntities.filter {
            if let productId = $0.productID, ids.contains(productId) {
                return true
            }
            return false
        }
        self.entityImages.removeAll()
        self.entities.removeAll()
        self.entities.append(contentsOf: filtered)
        DispatchQueue.main.async {
            let range = NSMakeRange(0, self.tableView.numberOfSections)
            let sections = NSIndexSet(indexesIn: range)
            self.tableView.reloadSections(sections as IndexSet, with: .automatic)
            self.logger.info("Alan: Table updated successfully!")
        }
    }
    
    
    func requestEntities(completionHandler: @escaping (Error?) -> Void) {
        self.loadEntitiesBlock!() { entities, error in
            if let error = error {
                completionHandler(error)
                return
            }
            
            self.entities = entities!
            self.allEntities.append(contentsOf: entities!)
            
            let encoder = JSONEncoder()
            if let encodedEntityValue = try? encoder.encode(self.entities) {
                if let json = String(data: encodedEntityValue, encoding: .utf8) {
                    print(json)
                    if let window = UIApplication.shared.keyWindow {
                        window.call(method: "script::updateProductEntities", params: ["json": json] , callback: { (error, result) in
                        })
                    }
                }
            }
            
            completionHandler(nil)
        }
    }

    // MARK: - Segues

    override func prepare(for segue: UIStoryboardSegue, sender _: Any?) {
        if segue.identifier == "showDetail" {
            // Show the selected Entity on the Detail view
            guard let indexPath = self.tableView.indexPathForSelectedRow else {
                return
            }
            self.logger.info("Showing details of the chosen element.")
            let selectedEntity = self.entities[indexPath.row]
            let detailViewController = segue.destination as! ProductDetailViewController
            detailViewController.entity = selectedEntity
            detailViewController.navigationItem.leftItemsSupplementBackButton = true
            detailViewController.navigationItem.title = selectedEntity.productID ?? ""
            detailViewController.allowsEditableCells = false
            detailViewController.tableUpdater = self
            detailViewController.preventNavigationLoop = self.preventNavigationLoop
            detailViewController.espmContainer = self.espmContainer
            detailViewController.entitySetName = self.entitySetName
        } else if segue.identifier == "addEntity" {
            // Show the Detail view with a new Entity, which can be filled to create on the server
            self.logger.info("Showing view to add new entity.")
            let dest = segue.destination as! UINavigationController
            let detailViewController = dest.viewControllers[0] as! ProductDetailViewController
            detailViewController.title = NSLocalizedString("keyAddEntityTitle", value: "Add Entity", comment: "XTIT: Title of add new entity screen.")
            let doneButton = UIBarButtonItem(barButtonSystemItem: .done, target: detailViewController, action: #selector(detailViewController.createEntity))
            detailViewController.navigationItem.rightBarButtonItem = doneButton
            let cancelButton = UIBarButtonItem(title: NSLocalizedString("keyCancelButtonToGoPreviousScreen", value: "Cancel", comment: "XBUT: Title of Cancel button."), style: .plain, target: detailViewController, action: #selector(detailViewController.cancel))
            detailViewController.navigationItem.leftBarButtonItem = cancelButton
            detailViewController.allowsEditableCells = true
            detailViewController.tableUpdater = self
            detailViewController.espmContainer = self.espmContainer
            detailViewController.entitySetName = self.entitySetName
        }
    }

    // MARK: - Image loading

    private func image(for indexPath: IndexPath, product: Product) -> UIImage? {
        if let image = self.entityImages[indexPath.row] {
            return image
        } else {
            espmContainer.downloadMedia(entity: product, completionHandler: { data, error in
                if let error = error {
                    self.logger.error("Download media failed. Error: \(error)", error: error)
                    return
                }
                guard let data = data else {
                    self.logger.info("Media data is empty.")
                    return
                }
                if let image = UIImage(data: data) {
                    // store the downloaded image
                    self.entityImages[indexPath.row] = image
                    // update the cell
                    DispatchQueue.main.async {
                        self.tableView.beginUpdates()
                        if let cell = self.tableView.cellForRow(at: indexPath) as? FUIObjectTableViewCell {
                            cell.detailImage = image
                            cell.detailImageView.contentMode = .scaleAspectFit
                        }
                        self.tableView.endUpdates()
                    }
                }
            })
            return nil
        }
    }

    // MARK: - Table update

    func updateTable() {
        self.showFioriLoadingIndicator()
        DispatchQueue.global().async {
            self.loadData {
                self.hideFioriLoadingIndicator()
            }
        }
    }

    private func loadData(completionHandler: @escaping () -> Void) {
        self.requestEntities { error in
            defer {
                completionHandler()
            }
            if let error = error {
                AlertHelper.displayAlert(with: NSLocalizedString("keyErrorLoadingData", value: "Loading data failed!", comment: "XTIT: Title of loading data error pop up."), error: error, viewController: self)
                self.logger.error("Could not update table. Error: \(error)", error: error)
                return
            }
            DispatchQueue.main.async {
                self.tableView.reloadData()
                self.logger.info("Table updated successfully!")
            }
        }
    }

    @objc func refresh() {
        DispatchQueue.global().async {
            self.loadData {
                DispatchQueue.main.async {
                    self.refreshControl?.endRefreshing()
                }
            }
        }
    }
}

extension ProductMasterViewController: EntitySetUpdaterDelegate {
    func entitySetHasChanged() {
        self.updateTable()
    }
}
[/code]
[code language="swift" collapse="true" title="CollectionsViewController.swift" highlight="20,36-91,107-110"]
//
// AlanDeliveries
//
// Created by SAP Cloud Platform SDK for iOS Assistant application on 24/04/19
//

import Foundation
import SAPFiori
import SAPFioriFlows
import SAPOData

protocol EntityUpdaterDelegate {
    func entityHasChanged(_ entity: EntityValue?)
}

protocol EntitySetUpdaterDelegate {
    func entitySetHasChanged()
}

class CollectionsViewController: FUIFormTableViewController, NavigateViewDelegate {
    private var collections = CollectionType.all

    // Variable to store the selected index path
    private var selectedIndex: IndexPath?

    private let okTitle = NSLocalizedString("keyOkButtonTitle",
                                            value: "OK",
                                            comment: "XBUT: Title of OK button.")

    var isPresentedInSplitView: Bool {
        return !(self.splitViewController?.isCollapsed ?? true)
    }

    // Navigate
    
    func navigateBack() {
        DispatchQueue.main.async {
            if let navigation1 = self.splitViewController?.viewControllers.last as? UINavigationController {
                if let navigation2 = navigation1.viewControllers.last as? UINavigationController {
                    if navigation2.viewControllers.count < 2 {
                        navigation1.popViewController(animated: true)
                    }
                    else {
                        if let last = navigation2.viewControllers.last {
                            last.navigationController?.popViewController(animated: true)
                        }
                    }
                }
            }
        }
    }
    
    func navigateCategory(_ category: String) {
        // Map the category name received from the voice script to its row in the collections list
        let categoryRows: [String: Int] = [
            "Supplier": 0,
            "ProductCategory": 1,
            "ProductText": 2,
            "PurchaseOrderItems": 3,
            "PurchaseOrderHeaders": 4,
            "Stock": 5,
            "Sales": 6,
            "Customer": 7,
            "SalesOrder": 8,
            "Product": 9
        ]
        let indexPath = IndexPath(row: categoryRows[category] ?? 0, section: 0)
        DispatchQueue.main.async {
            self.navigationController?.popToRootViewController(animated: true)
            DispatchQueue.main.asyncAfter(deadline: .now() + 0.5) {
                self.collectionSelected(at: indexPath)
            }
        }
    }
    
    // MARK: - Lifecycle

    override func viewDidLoad() {
        super.viewDidLoad()
        self.preferredContentSize = CGSize(width: 320, height: 480)

        self.tableView.rowHeight = UITableView.automaticDimension
        self.tableView.estimatedRowHeight = 44
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        self.makeSelection()
        
        if let window = UIApplication.shared.keyWindow {
            window.setVisual(["screen": "Main"])
            window.navigateViewDelegate = self
        }
    }

    override func viewWillTransition(to _: CGSize, with coordinator: UIViewControllerTransitionCoordinator) {
        coordinator.animate(alongsideTransition: nil, completion: { _ in
            let isNotInSplitView = !self.isPresentedInSplitView
            self.tableView.visibleCells.forEach { cell in
                // To refresh the disclosure indicator of each cell
                cell.accessoryType = isNotInSplitView ? .disclosureIndicator : .none
            }
            self.makeSelection()
        })
    }

    // MARK: - UITableViewDelegate

    override func numberOfSections(in _: UITableView) -> Int {
        return 1
    }

    override func tableView(_: UITableView, numberOfRowsInSection _: Int) -> Int {
        return collections.count
    }

    override func tableView(_: UITableView, heightForRowAt _: IndexPath) -> CGFloat {
        return 44
    }

    override func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        let cell = tableView.dequeueReusableCell(withIdentifier: FUIObjectTableViewCell.reuseIdentifier, for: indexPath) as! FUIObjectTableViewCell
        cell.headlineLabel.text = self.collections[indexPath.row].rawValue
        cell.accessoryType = !self.isPresentedInSplitView ? .disclosureIndicator : .none
        cell.isMomentarySelection = false
        return cell
    }

    override func tableView(_: UITableView, didSelectRowAt indexPath: IndexPath) {
        self.collectionSelected(at: indexPath)
    }

    // CollectionType selection helper
    private func collectionSelected(at indexPath: IndexPath) {
        // Load the EntityType-specific ViewController from its storyboard
        var masterViewController: UIViewController!
        guard let espmContainer = OnboardingSessionManager.shared.onboardingSession?.odataController.espmContainer else {
            AlertHelper.displayAlert(with: "OData service is not reachable, please onboard again.", error: nil, viewController: self)
            return
        }
        self.selectedIndex = indexPath

        switch self.collections[indexPath.row] {
        case .suppliers:
            let supplierStoryBoard = UIStoryboard(name: "Supplier", bundle: nil)
            let supplierMasterViewController = supplierStoryBoard.instantiateViewController(withIdentifier: "SupplierMaster") as! SupplierMasterViewController
            supplierMasterViewController.espmContainer = espmContainer
            supplierMasterViewController.entitySetName = "Suppliers"
            func fetchSuppliers(_ completionHandler: @escaping ([Supplier]?, Error?) -> Void) {
                // Only request the first 20 values. If you want to modify the requested entities, you can do it here.
                let query = DataQuery().selectAll().top(20)
                do {
                    espmContainer.fetchSuppliers(matching: query, completionHandler: completionHandler)
                }
            }
            supplierMasterViewController.loadEntitiesBlock = fetchSuppliers
            supplierMasterViewController.navigationItem.title = "Supplier"
            masterViewController = supplierMasterViewController
        case .productCategories:
            let productCategoryStoryBoard = UIStoryboard(name: "ProductCategory", bundle: nil)
            let productCategoryMasterViewController = productCategoryStoryBoard.instantiateViewController(withIdentifier: "ProductCategoryMaster") as! ProductCategoryMasterViewController
            productCategoryMasterViewController.espmContainer = espmContainer
            productCategoryMasterViewController.entitySetName = "ProductCategories"
            func fetchProductCategories(_ completionHandler: @escaping ([ProductCategory]?, Error?) -> Void) {
                // Only request the first 20 values. If you want to modify the requested entities, you can do it here.
                let query = DataQuery().selectAll().top(20)
                do {
                    espmContainer.fetchProductCategories(matching: query, completionHandler: completionHandler)
                }
            }
            productCategoryMasterViewController.loadEntitiesBlock = fetchProductCategories
            productCategoryMasterViewController.navigationItem.title = "ProductCategory"
            masterViewController = productCategoryMasterViewController
        case .productTexts:
            let productTextStoryBoard = UIStoryboard(name: "ProductText", bundle: nil)
            let productTextMasterViewController = productTextStoryBoard.instantiateViewController(withIdentifier: "ProductTextMaster") as! ProductTextMasterViewController
            productTextMasterViewController.espmContainer = espmContainer
            productTextMasterViewController.entitySetName = "ProductTexts"
            func fetchProductTexts(_ completionHandler: @escaping ([ProductText]?, Error?) -> Void) {
                // Only request the first 20 values. If you want to modify the requested entities, you can do it here.
                let query = DataQuery().selectAll().top(20)
                do {
                    espmContainer.fetchProductTexts(matching: query, completionHandler: completionHandler)
                }
            }
            productTextMasterViewController.loadEntitiesBlock = fetchProductTexts
            productTextMasterViewController.navigationItem.title = "ProductText"
            masterViewController = productTextMasterViewController
        case .purchaseOrderItems:
            let purchaseOrderItemStoryBoard = UIStoryboard(name: "PurchaseOrderItem", bundle: nil)
            let purchaseOrderItemMasterViewController = purchaseOrderItemStoryBoard.instantiateViewController(withIdentifier: "PurchaseOrderItemMaster") as! PurchaseOrderItemMasterViewController
            purchaseOrderItemMasterViewController.espmContainer = espmContainer
            purchaseOrderItemMasterViewController.entitySetName = "PurchaseOrderItems"
            func fetchPurchaseOrderItems(_ completionHandler: @escaping ([PurchaseOrderItem]?, Error?) -> Void) {
                // Only request the first 20 values. If you want to modify the requested entities, you can do it here.
                let query = DataQuery().selectAll().top(20)
                do {
                    espmContainer.fetchPurchaseOrderItems(matching: query, completionHandler: completionHandler)
                }
            }
            purchaseOrderItemMasterViewController.loadEntitiesBlock = fetchPurchaseOrderItems
            purchaseOrderItemMasterViewController.navigationItem.title = "PurchaseOrderItem"
            masterViewController = purchaseOrderItemMasterViewController
        case .purchaseOrderHeaders:
            let purchaseOrderHeaderStoryBoard = UIStoryboard(name: "PurchaseOrderHeader", bundle: nil)
            let purchaseOrderHeaderMasterViewController = purchaseOrderHeaderStoryBoard.instantiateViewController(withIdentifier: "PurchaseOrderHeaderMaster") as! PurchaseOrderHeaderMasterViewController
            purchaseOrderHeaderMasterViewController.espmContainer = espmContainer
            purchaseOrderHeaderMasterViewController.entitySetName = "PurchaseOrderHeaders"
            func fetchPurchaseOrderHeaders(_ completionHandler: @escaping ([PurchaseOrderHeader]?, Error?) -> Void) {
                // Only request the first 20 values. If you want to modify the requested entities, you can do it here.
                let query = DataQuery().selectAll().top(20)
                do {
                    espmContainer.fetchPurchaseOrderHeaders(matching: query, completionHandler: completionHandler)
                }
            }
            purchaseOrderHeaderMasterViewController.loadEntitiesBlock = fetchPurchaseOrderHeaders
            purchaseOrderHeaderMasterViewController.navigationItem.title = "PurchaseOrderHeader"
            masterViewController = purchaseOrderHeaderMasterViewController
        case .stock:
            let stockStoryBoard = UIStoryboard(name: "Stock", bundle: nil)
            let stockMasterViewController = stockStoryBoard.instantiateViewController(withIdentifier: "StockMaster") as! StockMasterViewController
            stockMasterViewController.espmContainer = espmContainer
            stockMasterViewController.entitySetName = "Stock"
            func fetchStock(_ completionHandler: @escaping ([Stock]?, Error?) -> Void) {
                // Only request the first 20 values. If you want to modify the requested entities, you can do it here.
                let query = DataQuery().selectAll().top(20)
                do {
                    espmContainer.fetchStock(matching: query, completionHandler: completionHandler)
                }
            }
            stockMasterViewController.loadEntitiesBlock = fetchStock
            stockMasterViewController.navigationItem.title = "Stock"
            masterViewController = stockMasterViewController
        case .salesOrderItems:
            let salesOrderItemStoryBoard = UIStoryboard(name: "SalesOrderItem", bundle: nil)
            let salesOrderItemMasterViewController = salesOrderItemStoryBoard.instantiateViewController(withIdentifier: "SalesOrderItemMaster") as! SalesOrderItemMasterViewController
            salesOrderItemMasterViewController.espmContainer = espmContainer
            salesOrderItemMasterViewController.entitySetName = "SalesOrderItems"
            func fetchSalesOrderItems(_ completionHandler: @escaping ([SalesOrderItem]?, Error?) -> Void) {
                // Only request the first 20 values. If you want to modify the requested entities, you can do it here.
                let query = DataQuery().selectAll().top(20)
                do {
                    espmContainer.fetchSalesOrderItems(matching: query, completionHandler: completionHandler)
                }
            }
            salesOrderItemMasterViewController.loadEntitiesBlock = fetchSalesOrderItems
            salesOrderItemMasterViewController.navigationItem.title = "SalesOrderItem"
            masterViewController = salesOrderItemMasterViewController
        case .customers:
            let customerStoryBoard = UIStoryboard(name: "Customer", bundle: nil)
            let customerMasterViewController = customerStoryBoard.instantiateViewController(withIdentifier: "CustomerMaster") as! CustomerMasterViewController
            customerMasterViewController.espmContainer = espmContainer
            customerMasterViewController.entitySetName = "Customers"
            func fetchCustomers(_ completionHandler: @escaping ([Customer]?, Error?) -> Void) {
                // Only request the first 20 values. If you want to modify the requested entities, you can do it here.
                let query = DataQuery().selectAll().top(20)
                do {
                    espmContainer.fetchCustomers(matching: query, completionHandler: completionHandler)
                }
            }
            customerMasterViewController.loadEntitiesBlock = fetchCustomers
            customerMasterViewController.navigationItem.title = "Customer"
            masterViewController = customerMasterViewController
        case .salesOrderHeaders:
            let salesOrderHeaderStoryBoard = UIStoryboard(name: "SalesOrderHeader", bundle: nil)
            let salesOrderHeaderMasterViewController = salesOrderHeaderStoryBoard.instantiateViewController(withIdentifier: "SalesOrderHeaderMaster") as! SalesOrderHeaderMasterViewController
            salesOrderHeaderMasterViewController.espmContainer = espmContainer
            salesOrderHeaderMasterViewController.entitySetName = "SalesOrderHeaders"
            func fetchSalesOrderHeaders(_ completionHandler: @escaping ([SalesOrderHeader]?, Error?) -> Void) {
                // Only request the first 20 values. If you want to modify the requested entities, you can do it here.
                let query = DataQuery().selectAll().top(20)
                do {
                    espmContainer.fetchSalesOrderHeaders(matching: query, completionHandler: completionHandler)
                }
            }
            salesOrderHeaderMasterViewController.loadEntitiesBlock = fetchSalesOrderHeaders
            salesOrderHeaderMasterViewController.navigationItem.title = "SalesOrderHeader"
            masterViewController = salesOrderHeaderMasterViewController
        case .products:
            let productStoryBoard = UIStoryboard(name: "Product", bundle: nil)
            let productMasterViewController = productStoryBoard.instantiateViewController(withIdentifier: "ProductMaster") as! ProductMasterViewController
            productMasterViewController.espmContainer = espmContainer
            productMasterViewController.entitySetName = "Products"
            func fetchProducts(_ completionHandler: @escaping ([Product]?, Error?) -> Void) {
                // Only request the first 20 values. If you want to modify the requested entities, you can do it here.
                
                let query = DataQuery().selectAll().top(20)
                do {
                    espmContainer.fetchProducts(matching: query, completionHandler: completionHandler)
                }
            }
            productMasterViewController.loadEntitiesBlock = fetchProducts
            productMasterViewController.navigationItem.title = "Product"
            masterViewController = productMasterViewController
        case .none:
            masterViewController = UIViewController()
        }

        // Load the NavigationController and present with the EntityType specific ViewController
        let mainStoryBoard = UIStoryboard(name: "Main", bundle: nil)
        let rightNavigationController = mainStoryBoard.instantiateViewController(withIdentifier: "RightNavigationController") as! UINavigationController
        rightNavigationController.viewControllers = [masterViewController]
        self.splitViewController?.showDetailViewController(rightNavigationController, sender: nil)
    }

    // MARK: - Handle highlighting of selected cell

    private func makeSelection() {
        if let selectedIndex = selectedIndex {
            tableView.selectRow(at: selectedIndex, animated: true, scrollPosition: .none)
            tableView.scrollToRow(at: selectedIndex, at: .none, animated: true)
        } else {
            selectDefault()
        }
    }

    private func selectDefault() {
        // Automatically select first element if we have two panels (iPhone plus and iPad only)
        if self.splitViewController!.isCollapsed || OnboardingSessionManager.shared.onboardingSession?.odataController.espmContainer == nil {
            return
        }
        let indexPath = IndexPath(row: 0, section: 0)
        self.tableView.selectRow(at: indexPath, animated: true, scrollPosition: .middle)
        self.collectionSelected(at: indexPath)
    }
}

[/code]

Once you’ve added your handlers in Xcode, save and build the application.

Test a few of the voice commands:

  • Open products
  • What products do you have?
  • Show notebooks less than 1500 euros
  • What’s the price of the Notebook Basic 15?
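On the voice-script side, a command like “Show notebooks less than 1500 euros” ultimately resolves to a list of product IDs, which is what the Swift delegate method showProductIds(_:) above receives. The filtering can be sketched in plain JavaScript; the product values and the matchingProductIds helper here are illustrative placeholders, not the actual Alan script:

```javascript
// Illustrative sketch: resolve a spoken price filter to product IDs.
// The shape of each product mirrors the JSON the app sends via
// "script::updateProductEntities" in requestEntities above.
const products = [
  { productID: "HT-1000", name: "Notebook Basic 15", category: "Notebooks", price: 956 },
  { productID: "HT-1001", name: "Notebook Basic 17", category: "Notebooks", price: 1249 },
  { productID: "HT-1010", name: "Notebook Professional 15", category: "Notebooks", price: 1999 },
  { productID: "HT-2001", name: "Flat Basic", category: "Monitors", price: 399 }
];

// Given a category and a price ceiling, return the matching product IDs —
// i.e. the array that would be handed to showProductIds(_:) on the iOS side.
function matchingProductIds(category, maxPrice) {
  return products
    .filter(p => p.category === category && p.price < maxPrice)
    .map(p => p.productID);
}

console.log(matchingProductIds("Notebooks", 1500)); // → [ 'HT-1000', 'HT-1001' ]
```

The app then filters its table to exactly those IDs and reloads, which is why the voice command appears to “drive” the UI directly.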

And that concludes our integration: a Visual Voice experience for this SAP sample application. The application was created as part of the Alan and SAP partnership to voice-enable the enterprise. Here’s a full video of the integration. For more details, please check out Alan’s documentation here.

Feel free to share your feedback or ask for support via sergey@alan.app.

]]>
https://alan.app/blog/add-a-visual-voice-experience-to-your-sap-mobile-applications-with-alan/feed/ 0 2041
Taking your Voice Interface to the Office https://alan.app/blog/taking-your-voice-assistant-to-the-office/ https://alan.app/blog/taking-your-voice-assistant-to-the-office/#comments Mon, 18 Dec 2017 20:24:33 +0000 https://alan.app/blog/?p=1407

Voice assistants have been finding their way into our daily lives. Among in-home voice-enabled speakers, Amazon has taken a 70% market foothold and sold over 25 million units, while Google trails at 23.8% market share with 5 million units sold. These devices, along with their mobile counterparts Siri, Cortana, and Google Assistant, have pushed voice into the mainstream, making consumers more comfortable using it as a hands-free way of getting things done.

Still, only 46% of U.S. adults use voice assistants, with the most frequent use cases being playing music, setting alarms, and checking the weather. The remaining 54% say they don’t use voice assistants simply because they are not interested. And it’s no wonder: several of the most common voice use cases can easily be accomplished with a few taps on the phone, and Siri’s and Google’s command-and-response flows are easily bungled, often taking more time than just tapping through to the desired app.

Each tech giant is playing to its own strengths in approaching the advantages voice enables: Amazon’s Echo is used for shopping, Apple is launching its HomePod in early 2018, doubling down on music, and Google is optimizing its Home device for search. So far, Amazon seems to be the only one delivering on those advantages, letting consumers say something as simple as “Amazon, order X”, which automatically completes the order and ships the items.

Traditional user interfaces are complex, and completing a task or performing a search is time-consuming. With voice, tasks can be completed in only a few words. Today, for example, if you wanted to find the recent changes to a document, you would first have to open your docs app, search for the document, open it, and then click a separate button to view the recent changes. With voice, you could simply ask “Are there any recent changes to the document?” and have the result spoken back to you.

At work, we often use business applications that require training and practice to use effectively within our organization. These products help us solve complex problems, but they shouldn’t require us to jump through hoops. As Steve Jobs said, the computer is “a bicycle for the mind.” Our software at work should be intelligent enough to help us go faster, not slow us down. With voice, we’ll be able to go faster with less effort. That is the future of voice we’re building at Alan.

]]>
https://alan.app/blog/taking-your-voice-assistant-to-the-office/feed/ 1 1407