In this article, we’ll discuss how to create a Node.js REST API and host this service in Azure. I’ll cover the following key steps:
- Creating a SQL database in Azure
- Creating a Node.js application using Hapi, Tedious and mssql
- Deploying the Node.js application via Visual Studio Online
Creating a SQL database in Azure
There are currently two different portals for managing resources in Azure. The original portal is https://manage.windowsazure.com, and the new Azure portal can be found at https://portal.azure.com. Unfortunately, at this point not everything can be managed from a single portal. For this article, we’ll use the new Azure portal.
Open the new Azure portal and log in. From the left navigation, select SQL databases. This will show a list of all existing databases in your Azure tenant. Click Add at the top to start the creation of a new database.
We’ll create a database with the name StarWarsHeroesDB. Then, select the server setting to either create a new server or use an existing server. If you’re creating a new server, you’ll need to set up the username and password to access the server.
Make sure “Allow Azure services to access server” is selected so that our Node.js application will be able to access the database. Under Select source, choose the source of the database. In our case, we’ll use Blank database to start from scratch. Make sure to update the pricing tier and other settings as you see fit.
Once the database has been created via the Azure portal, you’ll need to configure firewall settings so that both your computer and your Node.js application will have access to the database. Select the StarWarsHeroesDB either from the portal homepage or under the SQL databases section. Click the server name to open the server settings, and then click Show firewall settings to open the Firewall settings section.
Make sure Allow access to Azure services is set to On so that your Node.js app can access the database. Also, you’ll want to add your client IP address to access the database from your computer when debugging your Node.js app locally and via SQL Server tools. Note: If you switch networks (e.g., from work to home, etc.), you’ll need to update these settings to include the new IP address.
To manage the database, download SQL Server Management Studio; at the time of writing, the latest release is the January 2016 preview. Note that using the SQL Server Management Studio that ships with SQL Server 2014 will display errors when performing certain tasks, such as viewing stored procedures.
To connect to the database, use the server name (e.g., server.database.windows.net) with SQL Server Authentication using the username and password for access. Creating a separate SQL Server account for your application and granting access to only the resources needed for the application to run is highly recommended for production solutions.
For our solution, we’ll create a table called “Heroes” with an identifier field, a name field and two Boolean fields. Once we enter our data, we have something like the following:
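The exact schema ships with the attached source code. As a sketch, the table and a couple of sample rows might look like the following (the names of the two bit columns, IsGood and HasLightsaber, are assumptions for illustration):

```sql
-- A sketch of the Heroes table; the two bit column names are assumptions.
CREATE TABLE dbo.Heroes (
    Id INT IDENTITY(1,1) PRIMARY KEY,  -- the identifier field
    Name NVARCHAR(100) NOT NULL,       -- the name field
    IsGood BIT NOT NULL,               -- first Boolean field
    HasLightsaber BIT NOT NULL         -- second Boolean field
);

INSERT INTO dbo.Heroes (Name, IsGood, HasLightsaber)
VALUES ('Luke Skywalker', 1, 1),
       ('Darth Vader', 0, 1);
```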
Now we’ll create several stored procedures to be used by our Node.js application.
The first stored procedure will return a list of all Heroes.
The second stored procedure will take a Boolean (or Bit in SQL Server) and return either good heroes or bad heroes.
The third stored procedure will take an integer and return the Hero that has an ID value matching the integer.
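The full definitions are in the attached source code. Sketched out, the three procedures might look like this (the procedure names follow the article; column and parameter names are assumptions):

```sql
-- Returns every hero.
CREATE PROCEDURE dbo.AllHeroes
AS
    SELECT Id, Name, IsGood, HasLightsaber FROM dbo.Heroes;
GO

-- Takes a bit and returns either good or bad heroes.
CREATE PROCEDURE dbo.HeroesByIntent
    @Good BIT
AS
    SELECT Id, Name, IsGood, HasLightsaber FROM dbo.Heroes
    WHERE IsGood = @Good;
GO

-- Takes an integer and returns the hero with a matching ID.
CREATE PROCEDURE dbo.HeroesById
    @HeroId INT
AS
    SELECT Id, Name, IsGood, HasLightsaber FROM dbo.Heroes
    WHERE Id = @HeroId;
GO
```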
Now that we have our database created, we’ll turn our attention to the Node.js application.
Creating a Node.js application using Hapi, Tedious and mssql
Our next step is to create a Node.js application. We’ll only cover the most important parts of the solution in the article, but full source code is available for download at the end of the article.
To begin, we create a directory called StarWarsHeroesAPI, and then open a command window in this directory. Enter “npm init” into the command window to create the package.json file. Change the entry point to server.js (instead of index.js); all other defaults are acceptable.
To add all of the packages we’ll use for our solution, run the following command from the command window:
npm install boom hapi joi mssql tedious --save
The “--save” portion saves the dependencies to the package.json file. A new section similar to the following should now be in your package.json file.
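The section will look something like this; the exact version numbers will reflect whatever npm resolved at install time (the ones shown here are illustrative for early 2016):

```json
"dependencies": {
  "boom": "^3.1.1",
  "hapi": "^13.0.0",
  "joi": "^8.0.5",
  "mssql": "^3.1.2",
  "tedious": "^1.13.2"
}
```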
The dependencies we’ve added meet the following needs:
- Boom — provides a set of utilities for returning HTTP errors
- Hapi — framework for building applications and services, based on configuration rather than code, created by WalmartLabs
- Joi — for object schema validation
- Mssql — used with Tedious, offers connection pooling and built-in JSON serialization
- Tedious — implementation of TDS, used to connect to SQL Server
We need to add one more section at this point to our package.json file to ensure the version of Node is at least version 4. This is a requirement for Hapi to work properly.
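The added section looks like this:

```json
"engines": {
  "node": ">=4.0.0"
}
```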
Now it’s time to start building out the code in our application. Create a new directory titled lib. In this new directory, create the following files:
- dataStore.js — to handle the connection and queries to SQL Server
- handlers.js — will contain the functions called by the router
- routes.js — will contain the definition of our routes and the names of the functions to call when a route is hit
Starting with the dataStore.js file, we’ll add the functionality for the connection string.
Note the use of the process.env.DBUser value. This retrieves the value of the DBUser configuration setting from Azure. We’re including the “or” section only for testing locally. Ideally, your database for testing won’t have the same credentials or server name as your deployed database.
After we retrieve the user, password and server from Azure configuration settings, we create the config variable, which will be used as the connection string by mssql. The encrypt setting must be set to true if you’re connecting to SQL Server in Azure.
Now we come to calling the AllHeroes stored procedure to retrieve the data.
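A sketch of this call, written against the mssql 3.x API current at the time (the callback shape is an assumption based on the article's description):

```javascript
// dataStore.js (continued) -- a sketch against the mssql 3.x API of the time.
var sql = require('mssql');

function getAllHeroes(callback) {
    // Open a connection using the config object defined above.
    var connection = new sql.Connection(config);
    connection.connect(function (err) {
        if (err) { return callback(err, null); } // check for connection errors
        var request = new sql.Request(connection);
        request.execute('AllHeroes', function (err, recordsets) {
            if (err) { return callback(err, null); } // check for query errors
            // Return the data to the handler via the supplied callback.
            callback(null, recordsets);
        });
    });
}

module.exports.getAllHeroes = getAllHeroes;
```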
Let’s look through the code to note some important points. We use the config variable previously defined to make the connection to the database. Note that we always need to check for errors whenever we run anything against the database, both when connecting and when executing the request.
Once we have a connection, we create a new request and execute it. When we get the data, we use the callback passed in when the method is called to return the data back to the handler.
To call the HeroesByIntent stored procedure, we need to pass in an additional parameter into the DataStore.getHeroesByIntent function with the parameter value to determine whether we want good or bad heroes. The additional parameter is a Boolean value, which mssql will convert to the SQL Server bit type. To add the input parameter to the call, we add in a request.input call just prior to executing the request. The code looks like the following:
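A sketch of that function (the Good parameter name is an assumption):

```javascript
// dataStore.js (continued) -- sketch; the parameter name Good is an assumption.
function getHeroesByIntent(good, callback) {
    var connection = new sql.Connection(config);
    connection.connect(function (err) {
        if (err) { return callback(err, null); }
        var request = new sql.Request(connection);
        // The Boolean input parameter, mapped to SQL Server's bit type,
        // is added just prior to executing the request.
        request.input('Good', sql.Bit, good);
        request.execute('HeroesByIntent', function (err, recordsets) {
            if (err) { return callback(err, null); }
            callback(null, recordsets);
        });
    });
}

module.exports.getHeroesByIntent = getHeroesByIntent;
```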
To call the HeroesById stored procedure, we do the same as above, only changing the request.input to take an integer value instead of a bit value.
Now we move on to the handlers.js file. We start out by importing the dataStore file, as well as including Boom to assist us with error messages.
We’ll create three functions that will be used as handler functions by our routes. The first is the getAllHeroes function.
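A sketch of that handler, using the (request, reply) interface of the pre-v17 hapi releases current at the time:

```javascript
// handlers.js -- sketch using the pre-v17 hapi reply interface.
var DataStore = require('./dataStore');
var Boom = require('boom');

module.exports.getAllHeroes = function (request, reply) {
    DataStore.getAllHeroes(function (err, results) {
        if (err) {
            // Pass an HTTP 500 back to the client via Boom.
            return reply(Boom.badImplementation('Error retrieving heroes', err));
        }
        // The first item in the results array holds the stored procedure's rows.
        reply(results[0]);
    });
};
```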
Note that we call the getAllHeroes function we created in the dataStore.js file. The parameter for the getAllHeroes function is our callback, which runs when the query is completed. If an error occurs, we use Boom to pass back an error message with the HTTP error. If the query is successful, an array is passed back via the reply. Note: The first item in the results array will contain the results of our query within the stored procedure.
The same basic idea applies when we call getHeroesByIntent. We use request.params to retrieve the parameter we’ll define within the route.
When calling getHeroById, we know the ID field is a unique field and only one result will be returned. Because an array is returned, and we only want a singular item, we return the first item of the results.
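Sketches of the remaining two handlers (they assume the handlers.js requires shown earlier in this section; the dataStore function names are assumptions):

```javascript
// handlers.js (continued) -- sketches; dataStore function names are assumptions.
module.exports.getHeroesByIntent = function (request, reply) {
    // request.params.good comes from the named parameter in the route.
    DataStore.getHeroesByIntent(request.params.good, function (err, results) {
        if (err) { return reply(Boom.badImplementation('Error retrieving heroes', err)); }
        reply(results[0]);
    });
};

module.exports.getHeroById = function (request, reply) {
    DataStore.getHeroById(request.params.heroId, function (err, results) {
        if (err) { return reply(Boom.badImplementation('Error retrieving hero', err)); }
        // The ID is unique, so return the single matching item, not an array.
        reply(results[0][0]);
    });
};
```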
Now we’ll use the routes.js file to define the routes for our application.
First, we include Joi for validation and import our handlers file.
Next, we create a variable Routes. This array value will contain the configuration of our routes along with validation.
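A sketch of the routes array follows. The /heroes path matches the local test URL used later in the article; the two parameterized paths are assumptions:

```javascript
// routes.js -- sketch; the parameterized paths are assumptions.
var Joi = require('joi');
var Handlers = require('./handlers');

var routes = [
    {
        method: 'GET',
        path: '/heroes',
        handler: Handlers.getAllHeroes
    },
    {
        method: 'GET',
        path: '/heroes/intent/{good}',
        handler: Handlers.getHeroesByIntent,
        config: {
            validate: {
                // Only Boolean-like values are accepted for the good parameter.
                params: { good: Joi.boolean().required() }
            }
        }
    },
    {
        method: 'GET',
        path: '/heroes/{heroId}',
        handler: Handlers.getHeroById,
        config: {
            validate: {
                // heroId must be an integer greater than zero.
                params: { heroId: Joi.number().integer().min(1).required() }
            }
        }
    }
];

module.exports = routes;
```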
Looking at the first route defined, we define the URL, the method as GET (other options would include POST, PUT, DELETE, etc.) and the handler to call when the route is hit. The handler is the function we created in our handlers.js file.
In the second and third routes, we use named parameters (good and heroId) to pass values from the URL to the handler, where they’re retrieved via request.params. We also use Joi to validate the value passed into each parameter. The good parameter is validated to make sure it’s a Boolean, and the heroId parameter is validated to make sure it’s an integer greater than zero.
For the good parameter, the router and Joi allow “true,” “on” or “yes” to be passed in for true, and “false,” “off” or “no” to be passed via the route for false. If you enter 0 or 1, you’ll get an error message.
Finally, we need to create our server.js file in the root directory of our application. Our server.js file looks like the following:
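A sketch of server.js, using the pre-v17 hapi API current in 2016:

```javascript
// server.js -- sketch using the pre-v17 hapi API.
var Hapi = require('hapi');

var server = new Hapi.Server();

// In Azure, process.env.port is a named pipe; don't convert it to a number.
var port = process.env.port || 3000;
server.connection({ port: port });

// Register the routes defined in lib/routes.js.
server.route(require('./lib/routes'));

server.start(function (err) {
    if (err) { throw err; }
    console.log('Server running at:', server.info.uri);
});
```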
After requiring Hapi, we create the Hapi.Server, determine the port and create the connection for the server.
Make note of process.env.port. In Azure, this will receive the port environment variable via a named pipe (which looks something like \\.\pipe\longGuid). Don’t convert this to a number prior to the OR evaluation; otherwise, the value of the port variable will be 3000, which would prevent the Node.js application from running on the port expected by Azure.
We import our routes defined in routes.js and start the server.
You’ll also need to add the web.config in the included source code, which tells Azure to use iisnode as the web server and points to our server.js as the entry point of our application.
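The exact file is in the included source code; a typical minimal iisnode web.config looks something like this sketch:

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- A minimal iisnode web.config sketch; adjust if your entry point differs. -->
<configuration>
  <system.webServer>
    <handlers>
      <!-- Route requests for server.js through iisnode -->
      <add name="iisnode" path="server.js" verb="*" modules="iisnode" />
    </handlers>
    <rewrite>
      <rules>
        <!-- Send all other URLs to server.js so hapi's router can handle them -->
        <rule name="DynamicContent">
          <action type="Rewrite" url="server.js" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```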
To test our application locally, we can run “node server.js” from our command window. Then open a browser and go to http://localhost:3000/heroes to see the response.
Deploying the Node.js application via Visual Studio Online
We’ll start by including gulp and configuring gulp to package our solution. First, we go to the command window in the root of our solution and type:
npm install gulp gulp-zip minimist --save-dev
This will add gulp, gulp-zip and minimist as devDependencies in the package.json file.
Create a gulpfile.js file in the root directory. Copy the code from the attached source code. We won’t cover the details around the code in the gulp file, but when gulp runs, the code from the application is zipped up for deployment to Azure without including unneeded items such as the node_modules directory, gulpfile.js, etc.
Next, we need to add our code to a Visual Studio Online project. We won’t fully cover this here, but we need to create a project with Git as the source control. If you haven’t set it up before, you’ll need to create Git credentials within Visual Studio Online.
Once our code is fully committed to our Visual Studio Online Git repository, we’ll begin creating our build within Visual Studio Online (these steps are based on a Microsoft article). Click on the Build link at the top within our Visual Studio Online project.
Then click the plus button to add a new build definition.
In the Create a new build definition window, click Empty and then Next to start your build definition. Select our repository source (i.e., the Visual Studio Online project you created) and a default branch to deploy.
We’ll select the master default branch, but you could easily create additional build definitions using different branches and Azure web apps for development or test environments. Click Create when you’ve made the proper selections.
This will create our New Empty definition 1.
We’ll add multiple build steps to handle the deployment by clicking on Add build step. Add the following steps:
- npm under Package
- Gulp under Build
- Azure Web App Deployment under Deploy
- Publish Build Artifacts under Utilities
After you’ve added the build tasks, your screen should look like the following:
The defaults under the npm build step are acceptable. Click on the Gulp build step and make sure the settings are made as follows:
Click on the Azure Web App Deployment build step. If you don’t have an Azure subscription listed in the drop-down, click the Manage button to add a new one. In the new window, click on New Service Endpoint, then Azure. Switch to Certificate Based.
To get the certificate, download the publish settings file by clicking on the link at the bottom. A new window will open, and you may be prompted for your Azure login. Upon completion of the generation for your subscription file, the file will be downloaded. Give your connection a name, then use the downloaded publish settings file to make the following updates:
- Subscription Id — Use the Id value in the file.
- Subscription Name — Use the Name value in the file.
- Management Certificate — Use the ManagementCertificate value in the file (this will be very long).
Click OK to save. Return to the browser window with your build steps. Click the Refresh button next to Azure Subscription, then select the subscription you just created. Enter a web app name. If you already have a web app created you want to use, enter it in the Web App Name field, or use a new unique name to create a new web app. Enter “$(Build.StagingDirectory)/package.zip” into the Web Deploy Package field.
Now click on the Publish Build Artifacts build step. Enter the values as displayed below:
Click the Repository tab and make sure the correct Repository and branch are selected.
Click the Triggers tab. Select Continuous integration if you want the build to occur automatically after each check-in, or select Scheduled if you want to build on a set schedule. If neither option meets your needs, you can manually deploy your build on demand.
Click Save and enter a deployment name. Click Queue build to start the build process. You’ll see a console window appear and display your build status. The build and deployment may take five to ten minutes.
Once completed, open the Azure portal. Select your newly deployed application and click on Application Settings.
Create the following App Settings for connection to the SQL Server database.
- DBUser — the name of the user
- DBPassword — the password for the user
- DBServer — the name of the server in Azure where the database is located
When completed, go to your website (based on the web app name you created) to see the list of heroes in our database.
Hopefully, your application has been deployed and has no issues. Often, however, issues arise in our applications.
First, check that the node version in Azure matches the version you expect. To check the currently running version, from the Azure portal, select your web application. Then, select Tools. Scroll down and select Console. This will open a console window where you can enter commands. Type “node --version” to get your version.
To determine versions of node available on your Azure environment, from the tools section, select Kudu and then Go. This will open a new browser window, which will point to https://webname.scm.azurewebsites.net/ (where webname is the name you gave your application). Then click Runtime versions to see a list of available versions.
If you want to update the version of Node for your web application, go to the Settings page for your application, then click Application Settings. Update the WEBSITE_NODE_DEFAULT_VERSION setting to the version of node you want to run. Make sure to select one from the list of available versions or node may revert to a version you don’t want. As an example, when I entered a version not listed (4.2.5), node reverted to version 0.10.28, which doesn’t work with Hapi.
To get debug messages in the browser, include an iisnode.yml file in your root folder. Include the following two lines:
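The exact file is in the attached source code; commonly, the two iisnode settings that surface logging and debug output are:

```yaml
loggingEnabled: true
devErrorsEnabled: true
```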
Debug logs can be viewed by opening the site in FTP. The FTP hostname and FTP username can be viewed by selecting the web application in the Azure portal. Log files are under the LogFiles folder. These can give helpful error messages when something fails.
Hopefully, after all of these steps, you have a good, error-free Node.js application accessing data from SQL Server hosted in Azure. Although creating a Node.js application and deploying it to Azure are somewhat complex and involved, Azure offers a very good hosting environment for Node.js applications.
This article originally appeared on Mar. 29, 2016.