PowerShell DPAPI

I wanted to store some data in the registry, but how do I store it such that it is secure for my application to read back later? Well, there is a way using PowerShell to store data in the registry with DPAPI, which uses the machine key to protect the information. As you know, the only way to store data securely is by using encryption, and DPAPI provides us with that.

Note that DPAPI only works on Windows; if you are in a Linux environment there is no DPAPI.

What is DPAPI?

From Wikipedia we get this definition

DPAPI (Data Protection Application Programming Interface) is a simple cryptographic application programming interface available as a built-in component in Windows 2000 and later versions of Microsoft Windows operating systems. In theory the Data Protection API can enable symmetric encryption of any kind of data; in practice, its primary use in the Windows operating system is to perform symmetric encryption of asymmetric private keys, using a user or system secret as a significant contribution of entropy.

Now that we know it uses the machine key, we can write some PowerShell to store a value in our registry. We will write under the Wow6432Node registry key, which is the view a 32-bit process sees on 64-bit Windows; if you need the native 64-bit view instead, just remove Wow6432Node from the path.

PowerShell using DPAPI to store secure data in the Registry

What we will do is use a SecureString to read the data and convert the secure string into an encrypted string; then we will base64 encode it and store it in the registry (one could also leave it as-is and skip the base64 step). We use New-ItemProperty to create the value in the registry.

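The original commands were shown as screenshots; below is a minimal PowerShell sketch of the same idea. The key path, value name, and prompt text are placeholders, not from the original post.

    # Read the secret without echoing it to the console
    $secure = Read-Host -AsSecureString "Enter the value to store"

    # ConvertFrom-SecureString protects the value with DPAPI
    $encrypted = ConvertFrom-SecureString $secure

    # Optionally base64 encode the encrypted string
    $bytes  = [System.Text.Encoding]::Unicode.GetBytes($encrypted)
    $base64 = [System.Convert]::ToBase64String($bytes)

    # Create the key and store the value with New-ItemProperty
    New-Item -Path "HKLM:\SOFTWARE\Wow6432Node\MyApp" -Force | Out-Null
    New-ItemProperty -Path "HKLM:\SOFTWARE\Wow6432Node\MyApp" -Name "MySecret" -Value $base64 -PropertyType String -Force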

So here it is, how to store data in your registry using DPAPI. In my next post I will show how to read the value back using C#.

Hashicorp Vault for your NodeJS Secrets

To continue on from our previous blog post, I will introduce Hashicorp Vault as a key management system to manage the secrets for our NodeJS weather application.

Installing Vault

I will use docker to pull the docker image from dockerhub.

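The pull itself is a one-liner; at the time the image was published on Docker Hub simply as vault (newer releases live under hashicorp/vault):

    docker pull vault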

One can also download Vault for their OS at (https://www.vaultproject.io/downloads.html)

Now that I have the vault image pulled, I will create a docker compose file for Vault to use mysql as a back-end store. I could also run Vault in dev mode, but in dev mode Vault runs entirely in-memory and starts unsealed with a single unseal key; I wanted to show more of a real-life scenario of starting Vault.

First, I will create a couple of directories and files to store some configuration in.
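A sketch of the layout, inferred from the folders referenced below:

    mkdir myvault
    cd myvault
    mkdir config policies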

Now that we have a directory, we can create a docker-compose.yml file inside the myvault directory. I assume you already have the mysql image; if not, you can pull the image from dockerhub.
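A minimal docker-compose.yml sketch; the mysql password and database name are placeholders you should change:

    version: '2'
    services:
      mysql:
        image: mysql
        environment:
          MYSQL_ROOT_PASSWORD: secret
          MYSQL_DATABASE: vault
      vault:
        image: vault
        ports:
          - "8200:8200"
        volumes:
          - ./config:/config
          - ./policies:/policies
        cap_add:
          - IPC_LOCK
        command: server -config=/config/config.hcl
        depends_on:
          - mysql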

Before we fire up Vault, here is the content of the config.hcl file, located in the config folder we just created. This configures Vault with its storage options and a listener on port 8200.
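A sketch of that config.hcl, using Vault's mysql storage backend; the credentials must match the compose file, and tls is disabled since this is a local test:

    storage "mysql" {
      address  = "mysql:3306"
      username = "root"
      password = "secret"
      database = "vault"
    }

    listener "tcp" {
      address     = "0.0.0.0:8200"
      tls_disable = 1
    }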

Running Vault

Use the docker compose file and run the following command to bring up Vault
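From the myvault directory:

    docker-compose up -d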

If you mess up, you can always run docker-compose rm to remove the created containers.

Testing our installation

Now that we have Vault configured and running, we need to initialize it. Vault uses the Shamir Secret Sharing technique for its master key: the master key is split into multiple key shares, and a quorum of those shares combines back into the single master key.


Unseal Vault

We will use the operator init command to initialize Vault.
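Pointing the CLI at our local Vault:

    vault operator init -address=http://127.0.0.1:8200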

As we can see, it showed us the keys we need to use to unseal the Vault. So let's unseal it so that we can use it. We will need to run the unseal command 3 times, using a different key each time; I have skipped showing the first one.
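A sketch of the unseal calls, with placeholders for the key shares printed by init:

    vault operator unseal -address=http://127.0.0.1:8200 <unseal-key-1>
    vault operator unseal -address=http://127.0.0.1:8200 <unseal-key-2>
    vault operator unseal -address=http://127.0.0.1:8200 <unseal-key-3>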

Now we can use Vault to write data to. First we need to auth with the root token, which was given to us along with the keys when we initialized Vault. My test environment token was “5mEKu64nAk1PA5luVQHRmGLM”. Let's try to auth/login and also write and read something in Vault.
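Something like the following (vault login is the newer form of vault auth):

    vault login -address=http://127.0.0.1:8200 5mEKu64nAk1PA5luVQHRmGLM
    vault write -address=http://127.0.0.1:8200 secret/hello value=world
    vault read  -address=http://127.0.0.1:8200 secret/hello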

We can now also create a sample.json config file that a new microservice can consume. I have placed it in the config folder since it was already mounted (note the file is on 1 line).
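A hypothetical one-line sample.json; the actual keys your microservice expects are placeholders here:

    {"apikey":"your-openweather-api-key"}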

We can load the sample json data into Vault by using this command.
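Vault reads the key/value pairs straight from the file with the @ syntax:

    vault write -address=http://127.0.0.1:8200 secret/weatherapp/config @sample.json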

I am using the weatherapp as an example above.

We will most probably also create a policy file so that we can limit access to this secret; this will live inside the policies directory.
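A sketch of policies/weatherapp.hcl, following the same pattern as the web policy in the talk slides further down:

    path "secret/weatherapp/*" {
      policy = "read"
    }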

In order to load the policy into Vault, we use the policy write command.
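Assuming the file and policy name above:

    vault policy write -address=http://127.0.0.1:8200 weatherapp policies/weatherapp.hcl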

Now that the policy is in, let's test Vault by reading some of the values we just added.
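Still using the root token for now:

    vault read -address=http://127.0.0.1:8200 secret/weatherapp/config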

Above we have just used the root token to read, but at least we know we can get the data.

Wrap Token

Vault has a nice feature called wrap tokens, where you can put a limited time to live on a token which one can use to access Vault. Let's try to create one for us.
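Mirroring the command in the talk slides further down, we wrap a read of our config in a token with a 60 second TTL; the output includes a wrapping_token and its TTL:

    vault read -wrap-ttl=60s -address=http://127.0.0.1:8200 secret/weatherapp/config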

We can use the wrap token above, “lYO2AoJ95QEDnZgUbNxoWWsw”, to read the data now, and it is only valid for 60 seconds.
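We redeem it with unwrap:

    vault unwrap -address=http://127.0.0.1:8200 lYO2AoJ95QEDnZgUbNxoWWsw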

If we try to use another token, or if the token has already been used, we will get an error.

App Roles and Secrets

Now that we have learned something about wrap tokens and how to create and use them, I wanted to switch gears and talk about App Roles and Secrets. Vault provides AppRoles for your application to log in to the system. This is definitely not the best security option out there, since it is just like basic authentication with a username and password, but when we use a wrap token with it we can mitigate some of the security concerns.

There are better options out there, such as using Kubernetes to authenticate your app; I will try to cover those later in another blog post.

Let's create an approle for our weather app, and get the roleid that we can use for our application.
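A sketch of enabling AppRole, creating a role bound to our policy, and reading back its roleid; the exact flags in the original screenshots may have differed:

    vault auth enable -address=http://127.0.0.1:8200 approle
    vault write -address=http://127.0.0.1:8200 auth/approle/role/weatherapp policies=weatherapp
    vault read  -address=http://127.0.0.1:8200 auth/approle/role/weatherapp/role-id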

Sample Application

Now that we have the roleid, we need a secret in order for our application to log in to Vault and get its secrets. Rather than just giving the secret to our application, we will give it a wrap token to get the secret, since we can limit the amount of time the token is allowed to live, thus mitigating the risk of exposing the secret. The sample code below uses the env variable and the wrap token to get its secret.

I am using node-vault; you can install it with npm install node-vault.
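The original sample is not reproduced here; below is a minimal sketch of the same flow, assuming node-vault's unwrap and approleLogin helpers. The roleid value is a placeholder for the one read back earlier.

    // a sketch: unwrap the secret_id, log in via AppRole, then read the config
    const vault = require('node-vault')({ endpoint: 'http://127.0.0.1:8200' });

    const roleId = '<role-id-read-earlier>'; // hard coded, as discussed below
    const wrapToken = process.argv[2];       // the wrap token passed as the third parameter

    async function start() {
      // authenticate with the wrap token and unwrap the one-time secret_id
      vault.token = wrapToken;
      const unwrapped = await vault.unwrap();
      const secretId = unwrapped.data.secret_id;

      // log in with AppRole and keep the returned client token
      const login = await vault.approleLogin({ role_id: roleId, secret_id: secretId });
      vault.token = login.auth.client_token;

      // read the application config from Vault
      const config = await vault.read('secret/weatherapp/config');
      console.log(config.data);
    }

    start().catch(console.error);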

In the above code we can see that we have hard-coded the roleid into our code, but we mitigated the risk of someone stealing our docker env variable by putting a limited time to live on the wrap token used to authenticate and get our secrets from Vault. The command to generate the wrap token is below.
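The wrap token in this case wraps a freshly generated secret_id for the role:

    vault write -wrap-ttl=60s -address=http://127.0.0.1:8200 -f auth/approle/role/weatherapp/secret-id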

When we run the code we would do something like the following, with the third parameter being the wrap token.
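With app.js being the sample above:

    node app.js <wrap-token>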

If we try to reuse the wrap token we will get an error.

Summary

We have covered using a roleid and secret with Vault to authenticate and get our application secrets. There is definitely a downside to this, but we mitigated the risk by having a short time to live for the wrap token. There is also the option of having the wrap token mounted as a volume for your application, and picking it up from the mounted volume. I will cover those topics in a later blog post.

    Advantages

  • Even with docker inspect we are only able to see the wrap token, which is short-lived with a TTL and is just a GUID-style token, so the attack surface is lower.
  • A very simple pattern to follow; all config is stored under the application name (e.g. secret/appName/config, etc)
    Disadvantages

  • The microservice requires REST calls to Vault to consume the data, and may have a dependency on a Vault client library
  • DevOps or some scripts would still be required to put secrets in the correct place
  • What happens on a failure of Vault? The service does not start

Source code at
https://github.com/taswar/weatherappwithvault

Yesterday I gave a talk at the ForwardJS conference here in Ottawa on Hashicorp Vault – Using Vault for your Nodejs Secrets.

You can find the slides here on slideshare.

Transcript:

  1. Using vault for your NodeJS Secrets Taswar Bhatti – Solutions Architect Gemalto
  2. Secrets •
  3. About me • Taswar Bhatti (Microsoft MVP) • @taswarbhatti • http://taswar.zeytinsoft.com • Gemalto (System Architect)
  4. So what are secrets? • Secrets grants you AuthN or AuthZ to a system • Examples • Username & Passwords • Database credentials • API Token • TLS Certs
  5. Secret Sprawl • Secrets end up in • Source Code • Version Control Systems (Github, Gitlab, Bitbucket etc) • Configuration Management (Chef, Puppet, Ansible etc)
  6. Issues • How do we know who has access to those secrets • When was the last time they accessed it? • What if we want to change/rotate the secrets
  7. Desired secrets • Encryption at rest and in transit • Only decrypted in memory • Access control • Rotation & Revocation
  8. Secret Management – Vault • Centralized Secret Management • Encrypted at rest and transit • Lease and Renewal • ACL • Audit Trail • Multiple Client Auth Method (Ldap,Github, approle) • Dynamic Secrets • Encryption as a Service
  9. Dynamic Secrets • Allows one to lease a secret for a period of time e.g 2 hrs • Generates on demand and unique for each user/consumption • Audit trail
  10. Secure Secrets • AES 256 with GCM encryption • TLS 1.2 for clients • No HSM is required
  11. Unsealing the Vault • Vault requires encryption keys to encrypt data • Shamir Secret Key Sharing • Master key is split into multiple keys
  12. Shamir Secret Sharing
  13. Unseal • Unseal Key 1: QZdnKsOyGXaWoB2viLBBWLlIpU+tQrQy49D+Mq24/V0B • Unseal Key 2: 1pxViFucRZDJ+kpXAeefepdmLwU6QpsFZwseOIPqaPAC • Unseal Key 3: bw+yIvxrXR5k8VoLqS5NGW4bjuZym2usm/PvCAaMh8UD • Unseal Key 4: o40xl6lcQo8+DgTQ0QJxkw0BgS5n6XHNtWOgBbt7LKYE • Unseal Key 5: Gh7WPQ6rWgGTBRSMecuj8PR8IM0vMIFkSZtRNT4dw5MF • Initial Root Token: 5b781ff4-eee8-d6a1-ea42-88428a7e8815 • Vault initialized with 5 keys and a key threshold of 3. Please • securely distribute the above keys. When the Vault is re-sealed, • restarted, or stopped, you must provide at least 3 of these keys • to unseal it again. • Vault does not store the master key. Without at least 3 keys, • your Vault will remain permanently sealed.
  14. How to unseal • vault unseal -address=${VAULT_ADDR} QZdnKsOyGXaWoB2viLBBWLlIpU+tQrQy49D+Mq24/V0B • vault unseal -address=${VAULT_ADDR} bw+yIvxrXR5k8VoLqS5NGW4bjuZym2usm/PvCAaMh8UD • vault unseal -address=${VAULT_ADDR} Gh7WPQ6rWgGTBRSMecuj8PR8IM0vMIFkSZtRNT4dw5MF
  15. Writing Secrets • vault write -address=${VAULT_ADDR} secret/hello value=world • vault read -address=${VAULT_ADDR} secret/hello • Key Value • — —– • refresh_interval 768h0m0s • Value world
  16. Policy on secrets • We can assign application roles to the policy path “secret/web/*” { policy = “read” } • vault policy write -address=${VAULT_ADDR} web-policy ${DIR}/web-policy.hcl
  17. Reading secrets based on policy • vault read -address=${VAULT_ADDR} secret/web/web-apps • vault read -address=${VAULT_ADDR} secret/hello • Error reading secret/hello: Error making API request. • URL: GET http://127.0.0.1:8200/v1/secret/hello • Code: 403. Errors: • * permission denied
  18. Demo Using Vault
  19. Demo Docker Environment VAR • Issues with env variables
  20. Mount Temp File System into App • docker run -v /hostsecrets:/secrets • To mitigate reading from Env • Store your wrap token in the filesystem to use with vault • Have a limited time on the wrap token
  21. Wrap Token for App Secrets • Limit time token • Used to unwrap some secrets • vault read -wrap-ttl=60s -address=http://127.0.0.1:8200 secret/weatherapp/config • Key Value • — —– • wrapping_token: 35093b2a-60d4-224d-5f16-b802c82de1e7 • wrapping_token_ttl: 1m0s • wrapping_token_creation_time: 2017-09-06 09:29:03.4892595 +0000 UTC • wrapping_token_creation_path: secret/weatherapp/config
  22. App Roles • Allows machines or apps to authenticate with Vault • Using a role_id and secret_id as credentials • Assign polices to the app • Once logged in you get back a token to get secrets
  23. Demo App Using Node
  24. Kubernetes with Vault • Read Service Account JWT • App Sends Jwt and Role Name to Vault • Vault checks the signature of Jwt • Sends to TokenReviewer API • Vault sends back valid token for app
  25. Thank you • Contact me (taswar.bhatti@gemalto.com) • @taswarbhatti
Docker for your NodeJS application using VSCode

In this post, we will continue our previous nodejs weather application and introduce Docker for our NodeJS application using VSCode. We will build our application and host it in docker by adding a Dockerfile, and we will modify our code to take the APIKEY from an environment variable.

Create Dockerfile

Let’s launch our editor VSCode again with our solution and add the .dockerignore file first, since there are things that we do not want in our Docker image, kind of like gitignore.
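A typical .dockerignore for a Node project (a sketch):

    node_modules
    npm-debug.log
    .git
    .gitignore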

Next, add a Dockerfile with vscode so that we can build our image. We will use the carbon-alpine image, which is v8 of NodeJS; the reason we choose alpine is that it is small, has a minimal footprint, and is more secure.

Carbon is the latest LTS (long term support) version of node
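A sketch of the Dockerfile matching the description below:

    FROM node:carbon-alpine

    # create the app directory inside the image
    WORKDIR /usr/src/app

    # copy package.json and package-lock.json, then install packages
    COPY package*.json ./
    RUN npm install --only=production

    # copy the application source
    COPY . .

    # the app listens on the default port 3000
    EXPOSE 3000

    CMD ["npm", "start"]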

Above, we create a working directory, copy our package*.json, and run npm to install our packages. We also expose the default port 3000 and run our application with npm start.

Note: using --only=production will run npm in production mode (it will not download devDependencies)

Modify our source code

Before we build our image, we need to modify our app.js such that the APIKEY is no longer in the source code. Let’s make it so that it takes it from an environment variable. In doing so we will need to pass our APIKEY in our environment when we start to run our docker image, more on this later.
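In app.js that amounts to something like the following sketch (it assumes the apikey is exposed via app.set, as in the previous post):

    // take the APIKEY from the environment instead of hard-coding it
    app.set('apikey', process.env.APIKEY);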

Build our NodeJS Image

Now that we have modified app.js, we can go back to our Dockerfile and build our image. Hit Ctrl+` in vscode to launch the terminal.
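The image tag below is an assumption, matching the repo name:

    docker build -t nodejs-weather-app .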

Once it has finished building, you can see your docker image with this command.
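That is:

    docker images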

Run the image

Since we modified app.js, we now need to inject the APIKEY into our docker container; we can do so with this command in the vscode terminal using powershell. You need to use the proper APIKEY that you got from openweather. We are also mapping the container port 3000 to port 8080 on our local machine.
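A sketch, with an APIKEY placeholder and the image name from the build step above:

    docker run -p 8080:3000 -e APIKEY=<your-openweather-apikey> nodejs-weather-app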

Now we should be able to view our application on http://localhost:8080


Security Issue

So you may think that now that we pass our APIKEY through the environment we are all good, but there is still a security issue here. If someone hacks into your machine that runs docker they are still able to see your secrets. Say What???


Let's try to inspect our docker container using the following commands. First, let's find out the container id that docker is running under.
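To list the running containers:

    docker ps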

Now we can inspect the container by using this command; we don't have to type the entire id, the first 3 characters will do the trick.
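For example, if the id starts with abc:

    docker inspect abc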

If we look at the config section of the inspect output, we will see our APIKEY in plain text.
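Somewhere in the Env section of the output we would see something like this (illustrative values):

    "Env": [
        "APIKEY=1234567890abcdef"
    ]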

Now you may say that someone needs to have root access to view this information, but there is a better way to mitigate this: using a token. This is where a secret key management system comes in to help store your secrets. In the next blog post we will go over what Vault is and how it helps us store secrets and expose a token for our docker container to consume, without us exposing the APIKEY.

The source code of this can be found at https://github.com/taswar/nodejs-weather-app

Creating a NodeJS weather app with VSCode

In this post I wanted to go through how we create a NodeJS application with vscode. The application is yet another weather node app; it is an easy example, loosely based on this project (https://codeburst.io/build-a-weather-website-in-30-minutes-with-node-js-express-openweather-a317f904897b). The difference is that we will build upon it in a series on how to store secrets in a node.js application. In this first part we will create a weather app that requires an APIKEY to access an external API.

So let's get started; first things first, we need to get ourselves an apikey.
1. Head over to https://openweathermap.org/api and set yourself up with an api key, it's free.

Let's now create a node.js express application. Open up a cmd or powershell prompt and install the express generator.
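The generator comes from npm:

    npm install -g express-generator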

We will now create the weather application; we will use ejs for our views and add a .gitignore file by using the --git option.
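Assuming an app name matching the GitHub repo:

    express --view=ejs --git nodejs-weather-app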

In order to run our application we need to run npm install, and afterwards open the project up in our favorite IDE, VSCode.
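Something like:

    cd nodejs-weather-app
    npm install
    code .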

We should now see the view of vs code.


Let's open up index.ejs, located in the views folder, and modify the html there to something like this.
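The original markup is not reproduced here; below is a sketch loosely following the codeburst article this post is based on (the weather and error fields are supplied by the route we write next):

    <!DOCTYPE html>
    <html>
      <head>
        <title>Weather App</title>
        <link rel="stylesheet" href="/stylesheets/style2.css" />
      </head>
      <body>
        <div class="container">
          <form action="/" method="post">
            <input name="city" type="text" placeholder="Enter a city" required>
            <button type="submit">Get Weather</button>
          </form>
          <% if (weather !== null) { %><p><%= weather %></p><% } %>
          <% if (error !== null) { %><p><%= error %></p><% } %>
        </div>
      </body>
    </html>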

We also need to add a style2.css so that our html page looks a bit better; add a new file in your public/stylesheets folder. The css is taken from the article at (https://codeburst.io/build-a-weather-website-in-30-minutes-with-node-js-express-openweather-a317f904897b)

Now that we have the views in place, let's tackle our routes. Open up index.js in the routes folder.
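A sketch of routes/index.js, again loosely following the codeburst article; it assumes the request package (npm install request --save) and the apikey setting added to app.js in the next step:

    const express = require('express');
    const request = require('request');
    const router = express.Router();

    router.get('/', (req, res) => {
      res.render('index', { weather: null, error: null });
    });

    router.post('/', (req, res) => {
      const city = req.body.city;
      const apiKey = req.app.get('apikey'); // set in app.js below
      const url = `http://api.openweathermap.org/data/2.5/weather?q=${city}&units=metric&appid=${apiKey}`;

      request(url, (err, response, body) => {
        // parse only when the call succeeded
        const weather = err ? null : JSON.parse(body);
        if (err || !weather.main) {
          return res.render('index', { weather: null, error: 'Error, please try again' });
        }
        res.render('index', { weather: `It's ${weather.main.temp} degrees in ${weather.name}!`, error: null });
      });
    });

    module.exports = router;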

Last, we need to modify our app.js and add our APIKEY to it such that routes/index.js can access it. Add these 2 lines above your 404 code in app.js.
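The exact two lines are not reproduced here; the idea is a sketch like this, with a placeholder key:

    // above the 404 handler: make the APIKEY available to routes/index.js
    app.set('apikey', 'YOUR_OPENWEATHER_APIKEY');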

Now we can run our application in vscode. Hit Ctrl+` and get your terminal up and running. Type npm start, and you should be able to hit http://localhost:3000 to view your site.


Now you can test out your weather by just typing in the city.


One can view the source code at https://github.com/taswar/nodejs-weather-app

In our next post we will dockerize our solution so that we can run the app on docker, and we will try to tackle the APIKEY issue, which is embedded right now in our source code.

Elastic Stack Talk – Store 2 Million Audit Logs a Day into ElasticSearch

Yesterday I gave a talk on using Elastic Search for .NET developers here in Ottawa. The slides were mostly from my presentation at DevTeach Montreal last year.

You can find the slides here on slideshare.

Transcript:

1. STORE 2 MILLION OF AUDIT LOGS A DAY INTO ELASTICSEARCH Taswar Bhatti (Microsoft MVP) GEMALTO @taswarbhatti http://taswar.zeytinsoft.com taswar@gmail.com
2. WHO AM I? – 4 years Microsoft MVP – 17 years in software industry – Currently working as System Architect in Enterprise Security Space (Gemalto) – You may not have heard of Gemalto but 1/3 of the world population uses Gemalto, they just don't know it – Gemalto has stacks built in many environments: .NET, Java, Node, Lua, Python, mobile (Android, iOS), ebanking etc
3. AGENDA – Problem we had and wanted to solve with Elastic Stack – Intro to Elastic Stack (Ecosystem) – Logstash – Kibana – Beats – Elastic Search flows designs that we have considered – Future plans of using Elastic Search
4. QUESTION & POLL – How many of you are using Elastic or some other logging solution? – How do you normally log? Where do you log? – Do you log in Relational Database?
5. HOW DO YOU TROUBLESHOOT OR FIND YOUR BUGS – Typically in a distributed environment one has to go through the logs to find out where the issue is – Could be multiple systems that you have to go through which machine/server generated the log or monitoring multiple logs – Even monitor firewall logs to find traffic routing through which data center – Chuck Norris never troubleshoot; the trouble kills themselves when they see him coming
6. Image
7. OUR PROBLEM – We had distributed systems (microservices) that would generate many different types of logs, in different data centers – We also had authentication audit logs that had to be secure and stored for 1 year – We generate around 2 millions records of audit logs a day, 4TB with replications – We need to generate reports out of our data for customers – We were still using Monolith Solution in some core parts of the application – Growing pains of a successful application – We want to use a centralized scalable logging system for all our
8. FINDING BUGS THROUGH LOGS
9. A LITTLE HISTORY OF ELASTICSEARCH – Shay Banon created Compass in 2004 – Released Elastic Search 1.0 in 2010 – ElasticSearch the company was formed in 2012 – Shay's wife is still waiting for her recipe app
10. Image
11. ELASTIC STACK
12. ELASTICSEARCH – Written in Java backed by Lucene – Schema free, REST & JSON based document store – Search Engine – Distributed, Horizontally Scalable – No database storage, storage is Lucene – Apache 2.0 License
13. COMPANIES USING ELASTIC STACK
14. ELASTICSEARCH INDICES – Elastic organizes document in indices – Lucene writes and maintains the index files – ElasticSearch writes and maintains metadata on top of Lucene – Example: field mappings, index settings and other cluster metadata
15. DATABASE VS ELASTIC
16. ELASTIC CONCEPTS – Cluster : A cluster is a collection of one or more nodes (servers) – Node : A node is a single server that is part of your cluster, stores your data, and participates in the cluster’s indexing and search capabilities – Index : An index is a collection of documents that have somewhat similar characteristics. (e.g Product, Customer, etc) – Type : Within an index, you can define one or more types. A type is a logical category/partition of your index. – Document : A document is a basic unit of information that can be indexed – Shard/Replica: Index divided into multiple pieces called shards, replicas are copy of your shards
17. ELASTIC NODES – Master Node : which controls the cluster – Data Node : Data nodes hold data and perform data related operations such as CRUD, search, and aggregations. – Ingest Node : Ingest nodes are able to apply an ingest pipeline to a document in order to transform and enrich the document before indexing – Coordinating Node : only route requests, handle the search reduce phase, and distribute bulk indexing.
18. SAMPLE JSON DOCUMENT HTTP CALL JSON DOCUMENT
19. ELASTICSEARCH CLUSTER
20. TYPICAL CLUSTER SHARD & REPLICA
21. SHARD SEARCH AND INDEX
22. DEMO OF ELASTICSEARCH
23. LOGSTASH – Ruby application runs under JRuby on the JVM – Collects, parse, enrich data – Horizontally scalable – Apache 2.0 License – Large amount of public plugins written by Community https://github.com/logstash- plugins
24. TYPICAL USAGE OF LOGSTASH
25. Image
26. LOGSTASH INPUT
27. LOGSTASH FILTER
28. LOGSTASH OUTPUT
29. DEMO LOGSTASH
30. BEATS
31. BEATS – Lightweight shippers written in Golang (Non JVM shops can use them) – They follow unix philosophy; do one specific thing, and do it well – Filebeat : Logfile (think of it tail –f on steroids) – Metricbeat : CPU, Memory (like top), redis, mongodb usage – Packetbeat : Wireshark uses libpcap, monitoring packet http etc – Winlogbeat : Windows event logs to elastic – Dockbeat : Monitoring docker – Large community lots of other beats offered as opensource
32. Image
33. FILEBEAT
34. X-PACK – Elastic commercial offering (This is one of the ways they make money) – X-Pack is an Elastic Stack extension that bundles – Security (https to elastic, password to access Kibana) – Alerting – Monitoring – Reporting – Graph capabilities – Machine Learning
35. Image
36. KIBANA – Visual Application for Elastic Search (JS, Angular, D3) – Powerful frontend for dashboard for visualizing index information from elastic search – Historical data to form charts, graphs etc – Realtime search for index information
37. Image
38. DEMO KIBANA
39. DESIGNS WE WENT THROUGH – We started with simple design to measure throughput – One instance of logstash and one instance of ElasticSearch with filebeat
40. DOTNET CORE APP – We used a dotnetcore application to generate logs – Serilog to generate into json format and stored on file – Filebeat was installed on the linux machine to ship the logs to logstash
41. PERFORMANCE ELASTIC – 250 logs item per second for 30 minutes
42. OVERVIEW
43. LOGSTASH
44. ELASTIC SEARCH RUN TWO – 1000 logs per second, run for 30 minutes
45. PERFORMANCE
46. OTHER DESIGNS
47. WHAT WE ARE GOING WITH FOR NOW, UNTIL…
48. CONSIDERATIONS OF DATA – Index by day makes sense in some cases – In others you may want to index by size rather (Black Friday more traffic than other days); when shards are not balanced ElasticSearch doesn't like that – Don't index everything; if you are not going to search on specific fields mark them as text
49. FUTURE CONSIDERATIONS – Investigate into Elastic Search Machine learning – ElasticSearch with Kafka for cross data center replication
50. THANK YOU & OPEN TO QUESTIONS – Questions??? – Contact: Taswar@gmail.com – Blog: http://Taswar.zeytinsoft.com – Twitter: @taswarbhatti – LinkedIn (find me and add me)

Azure functions

I wanted to try out Azure Functions to see if I can serve html data from an Azure Function that I get from another third-party site. As in, I use RestSharp to call another site from my Azure Function, extract its content, and just display it as html output. This can come in handy: let's say you have a proxy that is blocking you from getting to certain sites, and you just want to extract some data from a site and process it. Since Azure Functions are so cheap, I thought I would give it a try.

Here is a sequence diagram to explain the idea of it.

(sequence diagram: the browser calls the Azure Function, which fetches the third-party site and returns its html)

So the idea is quite simple, you make an Azure Function that can serve you the site html since you are blocked to access it.

We first need to go and create ourselves an HttpTrigger C# function in Azure. If you wish to learn more about creating one, you can read my previous post on the Timer Event Trigger, which is very similar to the http trigger.

First we need to add a new file called project.json, so that we can add our RestSharp NuGet package.
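For the classic (v1) C# script functions, project.json looks like the following; the RestSharp version is just an example:

    {
      "frameworks": {
        "net46": {
          "dependencies": {
            "RestSharp": "105.2.3"
          }
        }
      }
    }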

For our code we will write just a simple GET call with RestSharp and serve the content back.
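A sketch of the run.csx, fetching a page (cbc.ca, as in the screenshot mentioned below) and returning the html; the original code differed in its details:

    using System.Net;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using RestSharp;

    public static HttpResponseMessage Run(HttpRequestMessage req, TraceWriter log)
    {
        // simple GET call to the third-party site with RestSharp
        var client = new RestClient("http://www.cbc.ca");
        var restResponse = client.Execute(new RestRequest("news", Method.GET));

        // serve the html content back to the caller
        var response = new HttpResponseMessage(HttpStatusCode.OK)
        {
            Content = new StringContent(restResponse.Content)
        };
        response.Content.Headers.ContentType = new MediaTypeHeaderValue("text/html");
        return response;
    }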

You can then simply view the output of your azure function by calling its Url in a browser. Yes, the images won't show up properly, but all we want is the html to extract some data from, so we are ok with that.


There you have it: serving html data from a third-party site with Azure Functions.

Azure functions

I was playing around with Azure Functions and thought I would write a quick blog post on how to use an Azure Function to read a text file and send an email out to me daily. Basically I went with a joke-a-day idea. So let's get started with writing some of the awesome code 🙂

Warning: Prerequisites assumed
If you don't have an Azure subscription, you can always create a free account before you begin.
Azure Function
Login to your Azure account and click the New button found in the upper left-hand corner of the Azure portal, then select Compute > Function App. You should see a create button for the Function App.

Function App

I have named my app zeytinmail since it was available, and used a new Resource Group; you can use an existing one.
The hosting plan is set to consumption plan, and since I am on the East Coast I have set the location to East US.
For storage I am just using an auto-generated one; I could use an existing one, but for simplicity's sake I am using the default that was auto-generated.

Function App Create

Now that we have it created we should see the overview of our function app.
Azure Function Overview

We can now click on function and create a Timer function with CSharp.
Timer Function

We should see the run.csx file open up with the Run method pre-populated.
run.csx

We will then upload the jokes.txt file onto the server. I got my jokes.txt from https://github.com/rdegges/yomomma-api/blob/master/jokes.txt; I just clicked on the Add button, created the file, and copy-pasted the text into it.

From there, in order to read the file in the Azure Function, all I have to do is put the path of the file in my code.

File Path
You can find out the path by using Kudu in Azure function, usually they are stored in D drive with D:\home\site\wwwroot\{functionName}\file

Your solution would look something like below:
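A sketch of the timer function so far; the function name in the path is an assumption:

    using System;
    using System.IO;

    public static void Run(TimerInfo myTimer, TraceWriter log)
    {
        // path to the jokes file as seen from Kudu
        var path = @"D:\home\site\wwwroot\TimerTriggerCSharp1\jokes.txt";
        var jokes = File.ReadAllLines(path);

        // pick a random joke of the day
        var joke = jokes[new Random().Next(jokes.Length)];
        log.Info($"Joke of the day: {joke}");
    }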

SendGrid
Next up, we need to create ourselves a SendGrid account.
SendGrid has a free tier for sending email.

We need to set up our Application Settings to hold our SendGrid API key. Go into Application Settings and add a new key; let's call it SendGridKey.
sendgrid key

We need to go into the Integrate section of the function, where we want a new Output for our function. We will select the SendGrid output to send an email out.
SendGrid Out

We can then put in our API key app setting like below, and a from address if we want.
SendGrid ApiKey

We now need to modify our function.json file to add the SendGrid information; add this inside the bindings array as another section.
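A sketch of that binding section; SendGridKey matches the app setting created above, and the from address is a placeholder:

    {
      "type": "sendGrid",
      "name": "message",
      "apiKey": "SendGridKey",
      "direction": "out",
      "from": "jokes@zeytinmail.com"
    }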

Finally we need to write the code to send the email out.
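A sketch of the final run.csx with the SendGrid output binding; the recipient address is a placeholder:

    #r "SendGrid"

    using System;
    using System.IO;
    using SendGrid.Helpers.Mail;

    public static void Run(TimerInfo myTimer, TraceWriter log, out Mail message)
    {
        var path = @"D:\home\site\wwwroot\TimerTriggerCSharp1\jokes.txt";
        var jokes = File.ReadAllLines(path);
        var joke = jokes[new Random().Next(jokes.Length)];

        // build the email; the from address comes from the binding
        message = new Mail { Subject = "Joke of the day" };
        var personalization = new Personalization();
        personalization.AddTo(new Email("you@example.com"));
        message.AddPersonalization(personalization);
        message.AddContent(new Content { Type = "text/plain", Value = joke });
    }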

With this we will be able to send the email to a user with a joke of the day.
EmailJoke

Last but not least, we need to set a schedule by clicking on Integrate and setting the time for the schedule to send the email out. If you want to learn more about the CRON timer format, you can visit this site: https://codehollow.com/2017/02/azure-functions-time-trigger-cron-cheat-sheet/
Azure Timer Schedule
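For example, a joke every morning at 9:00 would be the six-field CRON expression:

    0 0 9 * * *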

Summary

So here it is: how to send an email out with a joke of the day using SendGrid and Azure Functions. Have fun 😛

ElasticSearch for .NET Developers

For the past year I have been evaluating, working with, and even presenting on ElasticSearch, and I thought it would be good to showcase a series of articles on ElasticSearch for .NET developers, and what it brings to the table when developing a software solution. I also did a talk on ElasticSearch at Montreal DevTeach; if you are interested in my slides, feel free to view them on slideshare or my blog.

Without further ado, let's get started and look at what ElasticSearch really is.

First off, some consider ElasticSearch part of the ELK Stack, but with the rebranding the company has been calling it the Elastic Stack instead. ELK has stuck with many people and Google searches, but from here on we will call it just the Elastic Stack.

So what does the Elastic Stack consist of, you may wonder?
Basically, the Elastic Stack consists of ElasticSearch, Logstash and Kibana. Let's go through them individually so that we can understand what each component does and brings to a software solution.

ElasticSearch


This is the core search engine or store that you use for storing your data, and it is built in Java. It stores documents in json format and uses Lucene to index them; ElasticSearch provides and builds metadata on top of the index created by Lucene. (Note: Lucene is built in Java; there is also a port of Lucene to .NET called Lucene.Net.)

Some people may think that ElasticSearch is a database that we store data in, like mysql, postgres or mssql, but I would say Elastic is not really a database, since there is no db file and it does not have relationships like SQL. It is more like a NOSQL solution, but not quite like mongodb either. The best way to describe it, I would say, is to think of it as a search engine that you store documents in. I know it's confusing at first, but don't worry, it will become clear later on, or once you start playing around with it.
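For example, storing and fetching a document is just HTTP and JSON; a sketch against a local node, with made-up index and field names:

    curl -XPUT "http://localhost:9200/products/product/1" -H "Content-Type: application/json" -d '{"name":"coffee mug","price":12.5}'

    curl -XGET "http://localhost:9200/products/product/1"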

Logstash


Logstash is another module/component/service. You can use Logstash without using ElasticSearch; the main functionality of Logstash is to take some input, filter it, and output it somewhere. Again, the output does not need to be ElasticSearch, but usually it is. An example of Logstash usage: I have IIS logs or Apache logs, I input them into Logstash, geo-tag each of the IP addresses, and store them in ElasticSearch or some database. The main idea of Logstash is simple: INPUT -> FILTER -> OUTPUT. One more thing to note is that Logstash is built with JRuby on the JVM, and there are tons of open source plugins for Logstash that one can download, even to anonymize or encrypt the data before outputting it.
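A sketch of that Apache example as a Logstash pipeline; paths and hosts are placeholders:

    input {
      file {
        path => "/var/log/apache2/access.log"
      }
    }

    filter {
      grok {
        match => { "message" => "%{COMBINEDAPACHELOG}" }
      }
      geoip {
        source => "clientip"
      }
    }

    output {
      elasticsearch {
        hosts => ["localhost:9200"]
      }
    }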

Kibana


Kibana is the graphical user interface for ElasticSearch; it is used for analyzing your data and for creating charts from ElasticSearch data. It is quite powerful: one can slice and dice many kinds of charts using Kibana.
Kibana is built with node.js and is a single page application (SPA).

Beats


Beats are basically lightweight shippers of data. There are many types of beats, e.g. filebeat is used for shipping file data (e.g. apache.log) to ElasticSearch or Logstash, and winlogbeat allows one to ship windows events to ElasticSearch or Logstash; check out the beats offered by Elastic. You can also write your own beat using the libbeat library, and not to mention that beats are actually written in GoLang. If you are interested in using Golang with VSCode, check out the channel 9 video I did on golang and vscode.
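For instance, a minimal filebeat configuration of that era would ship an apache log to Logstash; paths and hosts are placeholders:

    filebeat.prospectors:
    - input_type: log
      paths:
        - /var/log/apache2/*.log

    output.logstash:
      hosts: ["localhost:5044"]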

So here we sum up the main components of the Elastic Stack. I will go through each component individually in upcoming blog posts, going through the install process and configuration.

Azure Key Vault talk at satazureday

I had the opportunity to speak at satazureday, the Azure Saturday here in Ottawa, last week, and went through the topic of Azure Key Vault. I also had a co-presenter to share the talk with, an upcoming public speaker, Petrica Mihai. He created most of the slides and the demo code in C# 🙂
You can view the code at https://github.com/mihaipetri/AzureKeyVaultNet

In any case if you are interested here are the slides on Azure Key Vault.

And the transcript:

  1. Azure Key Vault • What are we trying to solve with KeyVault?
    • Let’s step back and look at a Cloud Design Pattern
    • External Configuration Pattern
  2. External Configuration Pattern
  3. Typical Application
  4. Storing Configuration in file
  5. Multiple application
  6. External Configuration Pattern
    • Helps move configuration information out of the application deployment
    • This pattern can provide for easier management and control of configuration data
    • For sharing configuration data across applications and other application instances
  7. Problems
    • Configuration becomes part of deployment
    • Multiple applications share the same configuration
    • Hard to have access control over the configuration
  8. External Configuration Pattern
  9. When to use the pattern
    • When you have shared configuration, multiple application
    • You want to manage configuration centrally by DevOps
    • Provide audit for each configuration
  10. When not to use
    • When you only have a single application there is no need to use this pattern it will make things more complex
  11. Cloud Solution Offerings
    • Azure KeyVault (Today’sTalk)
    • Vault by Hashicorp
    • AWS KMS
    • Keywhiz
  12. What is Azure Key Vault ?
    • Safeguard cryptographic keys and secrets used by cloud applications and services
    • Use hardware security modules (HSMs)
    • Simplify and automate tasks for SSL/TLS certificates
  13. Gemalto / SafeNet – Hardware Security Module
  14. How Azure Key Vault can help you ?
    • Customers can import their own keys into Azure, and manage them
    • Keys are stored in a vault and invoked by URI when needed
    • KeyVault performs cryptographic operations on behalf of the application
    • The application does not see the customers’ keys
    • KeyVault is designed so that Microsoft does not see or extract your keys • Near real-time logging of key usage
  15. Bring Your Own Key (BYOK)
  16. Create a Key Vault New-AzureRmKeyVault -VaultName ‘MihaiKeyVault’ -ResourceGroupName ‘MihaiResourceGroup’ -Location ‘Canada East’
  17. Objects, identifiers, and versioning
    • Objects stored in Azure KeyVault (keys, secrets, certificates) retain versions whenever a new instance of an object is created, and each version has a unique identifier and URL
    • https://{keyvault-name}.vault.azure.net/{object-type}/{object- name}/{object-version}
  18. Azure Key Vault keys
    • Cryptographic keys in Azure KeyVault are represented as JSON Web Key [JWK] objects
    • RSA: A 2048-bit RSA key. This is a “soft” key, which is processed in software by KeyVault but is stored encrypted at rest using a system key that is in an HSM
    • RSA-HSM: An RSA key that is processed in an HSM
    • https://myvault.vault.azure.net/keys/mykey/abcdea84815e4ca8bc19c f8eb943ee88
  19. Create a Key Vault key $key = Add-AzureKeyVaultKey -VaultName ‘MihaiKeyVault’ -Name ‘MihaiFirstKey’ -Destination ‘Software’
  20. Azure Key Vault secrets
    • Secrets are octet sequences with a maximum size of 25k bytes each
    • The Azure KeyVault service does not provide any semantics for secrets; it accepts the data, encrypts and stores it, returning a secret identifier, “id”, that may be used to retrieve the secret
    • https://myvault.vault.azure.net/secrets/mysecret/abcdea54614e4ca7 ge14cf2eb943ab23
    • Create a Key Vault secret $secret = Set-AzureKeyVaultSecret -VaultName ‘MihaiKeyVault’ -Name ‘SQLPassword’ -SecretValue $secretvalue
    • Azure Key Vault certificates
      • Import/generate existing certificates, self-signed or enrolled from a Public Certificate Authority (DigiCert, GlobalSign and WoSign)
      • When a KeyVault certificate is created, an addressable key and secret are also created with the same name
      • https://myvault.vault.azure.net/certificates/mycertificate/abcdea848 15e4ca8bc19cf8eb943bb45
    • Create a Key Vault certificate
    • Secure your Key Vault
      • Access to a key vault is controlled through two separate interfaces: management plane and data plane
      • Authentication establishes the identity of the caller
      • Authorization determines what operations the caller is allowed to perform
      • For authentication both management plane and data plane use Azure Active Directory
      • For authorization, management plane uses role-based access control (RBAC) while data plane uses key vault access policy
    • Access Control
      • Access Control based on Azure AD
      • Access assigned at the Vault level
      • – permissions to keys
      • – permissions to secrets
      • Authentication against AzureAD
      • – application ID and key
      • – application ID and certificate
    • Azure Managed Service Identity (MSI)
      • Manage the credentials that need to be in your code for authenticating to cloud services
      • Azure KeyVault provides a way to securely store credentials and other keys and secrets, but your code needs to authenticate to Key Vault to retrieve them
      • Managed Service Identity (MSI) makes solving this problem simpler by giving Azure services an automatically managed identity in Azure Active Directory (Azure AD)
      • You can use this identity to authenticate to any service that supports AzureAD authentication, including KeyVault, without having any credentials in your code

      Azure Key Vault Logging

      • Monitor how and when your key vaults are accessed, and by whom
      • Save information in an Azure storage account that you provide
      • Use standard Azure access control methods to secure your logs by restricting who can access them
      • Delete logs that you no longer want to keep in your storage account
    • Azure Key Vault Pricing • Operations (Standard or Premium) $0.030 per 10000 operations
      • Advanced Operations (Standard or Premium) $0.150 per 10000 operations
      • Certificate Renewals (Standard or Premium) $3.00 per renewal
      • Hardware Security Module Protected Keys (Premium only) $1.00 per key
    • Azure Key Vault DEMO
      • Create KeyVault, Secrets, Keys and Certificates
      • Create AzureAD Application
      • Consuming Secrets and Keys https://azurekeyvaultnet.azurewebsites.net – live demo
      • https://github.com/mihaipetri/AzureKeyVaultNet – demo code