Alexa Skills can be developed using AWS Lambda functions or a REST API endpoint. Lambda functions are Amazon's implementation of serverless functions available in AWS. Amazon recommends using Lambda functions even though they are not easy to debug: while you can log to CloudWatch, you can't hit a breakpoint and step through the code.

Alexa Skill With .NET Core

This makes live debugging of Alexa requests a challenge. This post explains a simple but useful solution: wrap the skill logic in a .NET Standard class library, then stand up a Web API project for debugging and development and a Lambda function project for AWS deployment. This article shows how to create an environment to debug a locally-hosted Web API that uses the same logic as the Lambda function. Everything's written in C#.

Structuring the Solution

This approach requires a minimum of two projects in the solution:

  1. Lambda function project (.NET Core 2.1).
  2. Web API project (.NET Core 2.1).


These are the technologies used in this project:

  1. .NET Core 2.1.
  2. Visual Studio Community 2019.
  3. NuGet Package Manager.
  4. Alexa.NET (Version 1.13.0).
  5. ngrok.

This article assumes familiarity with creating Lambda functions. The AWS Toolkit for Visual Studio includes Lambda function projects and can be used to create .NET Core Lambda functions. All meaningful logic should be contained in the BusinessLogic folder and referenced by both the Lambda function project and the Web API project. Both the Lambda project and the Web API project should be thin wrappers for the business logic.

Install the Amazon.Lambda.Tools global tool if it is not already installed: dotnet tool install -g Amazon.Lambda.Tools.

If it is already installed, check whether a newer version is available and update it: dotnet tool update -g Amazon.Lambda.Tools.

Project Files

These are the main files of the project:

Project file structure

  • serverless.template — an AWS CloudFormation Serverless Application Model template file for declaring your serverless functions and other AWS resources.
  • aws-lambda-tools-defaults.json — default argument settings for use with Visual Studio and command line deployment tools for AWS.
  • Function.cs — class that derives from Amazon.Lambda.AspNetCoreServer.APIGatewayProxyFunction. The code in this file bootstraps the ASP.NET Core hosting framework. The Lambda function is defined in the base class.
  • LocalEntryPoint.cs — for local development this contains the executable Main function which bootstraps the ASP.NET Core hosting framework with Kestrel, as for typical ASP.NET Core applications.
  • Startup.cs — usual ASP.NET Core Startup class used to configure the services ASP.NET Core will use.
  • web.config — used for local development.
  • Controllers/AlexaController.cs — Alexa API controller that will receive all the requests from the cloud. This controller allows us to debug our Skill locally.
  • BusinessLogic/AlexaProcessor.cs — the code that processes all the Alexa requests, both from the Lambda function in Function.cs and from the POST controller in AlexaController.cs.
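The controller can stay as thin as a single POST action that delegates to the shared processor. Here is a minimal sketch of what Controllers/AlexaController.cs might look like; the namespace and the Process method name on AlexaProcessor are assumptions based on the file structure above, not the author's verbatim code:

```csharp
using Alexa.NET.Request;
using Alexa.NET.Response;
using Microsoft.AspNetCore.Mvc;

namespace AlexaSkill.Controllers
{
    [Route("api/[controller]")]
    [ApiController]
    public class AlexaController : ControllerBase
    {
        // Receives the JSON request from the Alexa cloud (or Postman/ngrok)
        // and delegates straight to the shared business logic, so the
        // controller stays a thin wrapper that mirrors the Lambda entry point.
        [HttpPost]
        public SkillResponse Post([FromBody] SkillRequest request)
        {
            return new AlexaProcessor().Process(request);
        }
    }
}
```

Because the route attribute is api/[controller], this action answers POST requests at /api/alexa, which is the endpoint used throughout the rest of the article.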

With the structure explained, it is time to see how everything fits together in the project:

High-level application workflow

Alexa Request Processor

Alexa.NET is a helper library for working with Alexa Skill requests/responses in C#. Whether you are using the AWS Lambda service or hosting your own service on your server, this library aims just to make working with the Alexa API more natural for a C# developer using a strongly-typed object model.

You can find all the documentation in their official GitHub repository.

Below is the class that manages all the Alexa requests using the Alexa.NET NuGet package, showing how easy it is to develop these kinds of voice apps in .NET:
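As a rough sketch (the intent name and response wording are illustrative assumptions, not the author's verbatim code), BusinessLogic/AlexaProcessor.cs can pattern-match on the strongly-typed request types that Alexa.NET provides:

```csharp
using Alexa.NET;
using Alexa.NET.Request;
using Alexa.NET.Request.Type;
using Alexa.NET.Response;

namespace AlexaSkill.BusinessLogic
{
    public class AlexaProcessor
    {
        // Single entry point shared by the Lambda function and the Web API:
        // maps each Alexa request type to a SkillResponse.
        public SkillResponse Process(SkillRequest skillRequest)
        {
            switch (skillRequest.Request)
            {
                case LaunchRequest _:
                    // Spoken when the user opens the skill with no intent.
                    return ResponseBuilder.Ask(
                        "Welcome! What would you like to do?",
                        new Reprompt("You can say hello, for example."));

                case IntentRequest intentRequest
                    when intentRequest.Intent.Name == "HelloWorldIntent":
                    return ResponseBuilder.Tell("Hello world from .NET Core!");

                case SessionEndedRequest _:
                    // No speech is allowed in response to a session end.
                    return ResponseBuilder.Empty();

                default:
                    return ResponseBuilder.Tell("Sorry, I didn't understand that.");
            }
        }
    }
}
```

Because this class lives in the shared library, both Function.cs and AlexaController.cs reduce to a single call to Process, which is exactly what makes local debugging of the Lambda logic possible.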

Build the Skill With Visual Studio

Visual Studio offers a lot of built-in functionality. If you want to build the whole solution within the IDE, you can do it interactively:

Visual Studio workbench

Run the Skill With Visual Studio

Running the Alexa Skill is as easy as clicking the play button in Visual Studio, using the configuration alexa_dotnet_lambda_helloworld:

Running Skill with Visual Studio

After executing it, you can send Alexa POST requests to http://localhost:5000/api/alexa.

This execution will run the web API project. If you want to run the Lambda function, you have to run the configuration Mock Lambda Test Tool:

Mock Lambda Test Tool

This execution will run a browser with the Mock Lambda Test Tool, and from there, you can execute all requests directly towards your Function.cs:

Mock Lambda Test Tool output

Debug the Skill With Visual Studio

Following the steps above, you can now set breakpoints anywhere in the C# files to debug your Skill:

Debugging Skill

Test Requests Locally

I’m sure you already know the famous tool, Postman. REST APIs have become the new standard in providing a public and secure interface for your service. Though REST has become ubiquitous, it’s not always easy to test. Postman makes it easier to test and manage HTTP REST APIs. Postman gives us multiple features to import, test, and share APIs, which will help you and your team be more productive in the long run.

After running your application, you will have an endpoint available at http://localhost:5000/api/alexa. With Postman, you can emulate any Alexa Request.

For example, you can test a LaunchRequest:
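A minimal LaunchRequest body to paste into Postman might look like the following (the IDs are placeholders; a real request from Alexa also carries a context object and signature headers):

```json
{
  "version": "1.0",
  "session": {
    "new": true,
    "sessionId": "amzn1.echo-api.session.PLACEHOLDER",
    "application": { "applicationId": "amzn1.ask.skill.PLACEHOLDER" },
    "user": { "userId": "amzn1.ask.account.PLACEHOLDER" }
  },
  "request": {
    "type": "LaunchRequest",
    "requestId": "amzn1.echo-api.request.PLACEHOLDER",
    "timestamp": "2019-08-01T00:00:00Z",
    "locale": "en-US"
  }
}
```

Send it as the raw JSON body of a POST to http://localhost:5000/api/alexa and you should get the skill's welcome response back.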

You can execute unit tests as well
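For instance, with xUnit you can exercise the business logic without any HTTP or Lambda plumbing. This sketch assumes AlexaProcessor exposes a Process(SkillRequest) method returning a SkillResponse; the method name is an assumption based on the file structure, not the author's exact API:

```csharp
using Alexa.NET.Request;
using Alexa.NET.Request.Type;
using Xunit;

public class AlexaProcessorTests
{
    [Fact]
    public void LaunchRequest_ReturnsSpokenWelcome()
    {
        var processor = new AlexaProcessor();
        var request = new SkillRequest { Request = new LaunchRequest() };

        var response = processor.Process(request);

        // Launching the skill should always produce some output speech.
        Assert.NotNull(response.Response.OutputSpeech);
    }
}
```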

Deploy Your Alexa Skill

You can do it directly from Visual Studio or from your CLI:

Here Are Some Steps to Follow From Visual Studio:

To deploy your Serverless application, right-click the project in Solution Explorer and select Publish to AWS Lambda.

To view your deployed application, open the Stack View window by double-clicking the stack name shown beneath the AWS CloudFormation node in the AWS Explorer tree. The Stack View also displays the root URL to your published application.

Publishing to AWS Lambda

Getting Started From the Command Line

Once you have edited your template and code, you can deploy your application using the Amazon.Lambda.Tools Global Tool from the command line.

Deploy the application: dotnet lambda deploy-serverless.

Test Requests Directly From Alexa

Ngrok is a very cool, lightweight tool that creates a secure tunnel on your local machine, along with a public URL you can use for browsing your local site or APIs.

When ngrok is running, it listens on the same port that your local web server is running on and proxies external requests to your local machine.

From there, it’s a simple step to get it to listen to your web server. Let’s say you’re running your local web server on port 5000. In the terminal, you’d type: ngrok http 5000. This starts ngrok listening on port 5000 and creates the secure tunnel:

Creating secure tunnel

Now go to the Alexa Developer Console, open your skill > Endpoint > HTTPS, and add the HTTPS URL generated above followed by /api/alexa (e.g., https://<your-subdomain>.ngrok.io/api/alexa, where the subdomain is generated by ngrok).

Select the My development endpoint is a sub-domain… option from the dropdown and click Save Endpoints at the top of the page.

Go to the Test tab in the Alexa Developer Console and launch your Skill.

The Alexa Developer Console will send an HTTPS request to the ngrok endpoint, which will route it to your Skill running on the Web API server at http://localhost:5000/api/alexa.


This example can be useful for developers who do not want to write their code in Java, Node.js, or Python, the officially supported languages for Alexa Skills. Keep in mind that the Alexa.NET NuGet package is not part of the official Alexa Skills Kit (ASK).

I recommend checking its official GitHub repository for the project's status, bugs, updates, etc. As you have seen in this example, community developers give you the possibility to create Skills in different ways. I hope this example project is useful to you.

That’s all, folks! You can find all the code in my GitHub.

I hope it will be useful! If you have any doubts or questions, do not hesitate to contact me or put a comment below!

Happy coding!
