ElasticStackTalk

Yesterday I gave a talk on using Elastic Search for .NET Developers here in Ottawa. The slides were mostly from my presentation at DevTeach Montreal last year.

You can find the slides here on slideshare.

Transcript:

1. STORE 2 MILLION AUDIT LOGS A DAY INTO ELASTICSEARCH Taswar Bhatti (Microsoft MVP) GEMALTO @taswarbhatti http://taswar.zeytinsoft.com taswar@gmail.com
2. WHO AM I? – 4 years Microsoft MVP – 17 years in software industry – Currently working as System Architect in the Enterprise Security space (Gemalto) – You may not have heard of Gemalto, but 1/3 of the world population uses Gemalto; they just don't know it – Gemalto has stacks built in many environments: .NET, Java, Node, Lua, Python, mobile (Android, iOS), ebanking etc
3. AGENDA – Problem we had and wanted to solve with Elastic Stack – Intro to Elastic Stack (ecosystem) – Logstash – Kibana – Beats – ElasticSearch flow designs that we have considered – Future plans for using ElasticSearch
4. QUESTION & POLL – How many of you are using Elastic or some other logging solution? – How do you normally log? Where do you log? – Do you log in Relational Database?
5. HOW DO YOU TROUBLESHOOT OR FIND YOUR BUGS – Typically in a distributed environment one has to go through the logs to find out where the issue is – It could be multiple systems you have to go through to find which machine/server generated the log, or multiple logs to monitor – You may even monitor firewall logs to find out which data center traffic is routing through – Chuck Norris never troubleshoots; the trouble kills itself when it sees him coming
6. Image
7. OUR PROBLEM – We had distributed systems (microservices) that would generate many different types of logs, in different data centers – We also had authentication audit logs that had to be secured and stored for 1 year – We generate around 2 million records of audit logs a day, 4TB with replication – We need to generate reports out of our data for customers – We were still using a monolith solution in some core parts of the application – Growing pains of a successful application – We want to use a centralized, scalable logging system for all our logs
8. FINDING BUGS THROUGH LOGS
9. A LITTLE HISTORY OF ELASTICSEARCH – Shay Banon created Compass in 2004 – Released the first version of ElasticSearch in 2010 – Elasticsearch the company was formed in 2012 – Shay's wife is still waiting for her recipe app
10. Image
11. ELASTIC STACK
12. ELASTICSEARCH – Written in Java backed by Lucene – Schema free, REST & JSON based document store – Search Engine – Distributed, Horizontally Scalable – No database storage, storage is Lucene – Apache 2.0 License
13. COMPANIES USING ELASTIC STACK
14. ELASTICSEARCH INDICES – Elastic organizes documents in indices – Lucene writes and maintains the index files – ElasticSearch writes and maintains metadata on top of Lucene – Example: field mappings, index settings and other cluster metadata
15. DATABASE VS ELASTIC
16. ELASTIC CONCEPTS – Cluster: A cluster is a collection of one or more nodes (servers) – Node: A node is a single server that is part of your cluster, stores your data, and participates in the cluster's indexing and search capabilities – Index: An index is a collection of documents that have somewhat similar characteristics (e.g. Product, Customer, etc.) – Type: Within an index, you can define one or more types. A type is a logical category/partition of your index – Document: A document is a basic unit of information that can be indexed – Shard/Replica: An index is divided into multiple pieces called shards; replicas are copies of your shards
17. ELASTIC NODES – Master Node: controls the cluster – Data Node: data nodes hold data and perform data-related operations such as CRUD, search, and aggregations – Ingest Node: ingest nodes are able to apply an ingest pipeline to a document in order to transform and enrich the document before indexing – Coordinating Node: only routes requests, handles the search reduce phase, and distributes bulk indexing
18. SAMPLE JSON DOCUMENT HTTP CALL JSON DOCUMENT
19. ELASTICSEARCH CLUSTER
20. TYPICAL CLUSTER SHARD & REPLICA
21. SHARD SEARCH AND INDEX
22. DEMO OF ELASTICSEARCH
23. LOGSTASH – Ruby application, runs under JRuby on the JVM – Collects, parses, and enriches data – Horizontally scalable – Apache 2.0 License – Large number of public plugins written by the community: https://github.com/logstash-plugins
24. TYPICAL USAGE OF LOGSTASH
25. Image
26. LOGSTASH INPUT
27. LOGSTASH FILTER
28. LOGSTASH OUTPUT
29. DEMO LOGSTASH
30. BEATS
31. BEATS – Lightweight shippers written in Golang (non-JVM shops can use them) – They follow the Unix philosophy: do one specific thing, and do it well – Filebeat: log files (think of it as tail -f on steroids) – Metricbeat: CPU, memory (like top), redis, mongodb usage – Packetbeat: like Wireshark, uses libpcap, monitoring packets, http etc – Winlogbeat: Windows event logs to elastic – Dockbeat: monitoring docker – Large community, with lots of other beats offered as open source
32. Image
33. FILEBEAT
34. X-PACK – Elastic's commercial offering (this is one of the ways they make money) – X-Pack is an Elastic Stack extension that bundles – Security (https to elastic, password to access Kibana) – Alerting – Monitoring – Reporting – Graph capabilities – Machine Learning
35. Image
36. KIBANA – Visual application for ElasticSearch (JS, Angular, D3) – Powerful frontend for building dashboards that visualize index information from ElasticSearch – Historical data to form charts, graphs etc – Realtime search of index information
37. Image
38. DEMO KIBANA
39. DESIGNS WE WENT THROUGH – We started with a simple design to measure throughput – One instance of Logstash and one instance of ElasticSearch with Filebeat
40. DOTNET CORE APP – We used a dotnetcore application to generate logs – Serilog to generate logs in JSON format, stored in a file – Filebeat was installed on the linux machine to ship the logs to Logstash
41. PERFORMANCE ELASTIC – 250 log items per second for 30 minutes
42. OVERVIEW
43. LOGSTASH
44. ELASTIC SEARCH RUN TWO – 1000 logs per second, run for 30 minutes
45. PERFORMANCE
46. OTHER DESIGNS
47. WHAT WE ARE GOING WITH FOR NOW, UNTIL…..
48. CONSIDERATIONS OF DATA – Indexing by day makes sense in some cases – In others you may want to index by size instead (Black Friday has more traffic than other days); ElasticSearch doesn't like it when shards are not balanced – Don't index everything; if you are not going to search on specific fields, mark them as text
49. FUTURE CONSIDERATIONS – Investigate into Elastic Search Machine learning – ElasticSearch with Kafka for cross data center replication
50. THANK YOU & OPEN TO QUESTIONS – Questions??? – Contact: Taswar@gmail.com – Blog: http://Taswar.zeytinsoft.com – Twitter: @taswarbhatti – LinkedIn (find me and add me)

Azure functions

I wanted to try out Azure Functions to see if I could serve HTML data from an Azure Function that I get from another third-party site. That is, I use RestSharp to call another site from my Azure Function, extract its content, and just display it as HTML output. This can come in handy when, let's say, you have a proxy blocking you from getting to certain sites and you just want to extract some data from a site and process it. Since Azure Functions are so cheap, I thought I would give it a try.

Here is a sequence diagram to explain the idea of it.

azureFuncBrowser

So the idea is quite simple: you make an Azure Function that can serve you the site's HTML since you are blocked from accessing it directly.

We first need to go and create ourselves an HttpTrigger C# function in Azure. If you wish to learn more about how to create an HTTP trigger, you can read my previous post on the Timer Event Trigger, which is very similar to the HTTP trigger.

First we will need to add a new file called project.json, so that we can add our RestSharp NuGet package.
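A hedged example of that project.json (the RestSharp version is an assumption; any recent 105.x package works the same way):

{
  "frameworks": {
    "net46": {
      "dependencies": {
        "RestSharp": "105.2.3"
      }
    }
  }
}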

For our code, we will write just a simple GET call with RestSharp and serve it back as a stream.
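Here is a minimal sketch of that run.csx. The target site (cbc.ca, matching the screenshots below) is an assumption, and the sketch returns the body as a string rather than a raw stream, for brevity:

using System.Net;
using System.Net.Http;
using System.Text;
using RestSharp;

public static HttpResponseMessage Run(HttpRequestMessage req, TraceWriter log)
{
    // Call the third-party site on the function's behalf.
    var client = new RestClient("http://www.cbc.ca");
    var response = client.Execute(new RestRequest("/", Method.GET));

    // Hand the raw HTML back to the caller as text/html.
    return new HttpResponseMessage(HttpStatusCode.OK)
    {
        Content = new StringContent(response.Content, Encoding.UTF8, "text/html")
    };
}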

You can then simply view the URL of your Azure Function by calling it in a browser. Yes, the images won't show up properly, but all we want is the HTML to extract some data out of, so we are OK with that.

cbc

There you have it: serving HTML data from a third-party site with Azure Functions.

Azure functions

I was playing around with Azure Functions and thought I would write a quick blog post on how to use them to read a text file and send an email out to me daily. Basically I went with a joke-a-day idea. So let's get started with writing some of the awesome code 🙂

Warning: Prerequisites Assumed
If you don’t have an Azure subscription, you can always create a free account before you begin.
Azure function
Log into your Azure account and click the New button found in the upper left-hand corner of the Azure portal, then select Compute > Function App. You should see a create button for the function app.

Function App

I have named my app zeytinmail since it was available, and used a new Resource Group; you can use an existing one.
The hosting plan is set to consumption plan, and since I am on the East Coast I have set the location to East US.
For storage I am just using an auto-generated one; I could use an existing one, but for simplicity's sake I am using the default that was auto-generated.

Function App Create

Now that we have it created we should see the overview of our function app.
Azure Function Overview

We can now click on Functions and create a Timer function with C#.
Timer Function

We should see the run.csx file open up with the Run method pre-populated.
run.csx

We will then upload the jokes.txt file onto the server. I got my jokes.txt from https://github.com/rdegges/yomomma-api/blob/master/jokes.txt. I just clicked on the Add button, created the file, and copy-pasted the text into it.

From there, in order to read the file in the Azure Function, all I have to do is put the path of the file in my code.

File Path
You can find the path by using Kudu in the Azure Function; files are usually stored on the D drive at D:\home\site\wwwroot\{functionName}\{file}

Your solution would look something like below:
Timer Code
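A hedged sketch of what that timer code might look like (the function-folder name in the path is an assumption; adjust it to your function's name):

using System;
using System.IO;

public static void Run(TimerInfo myTimer, TraceWriter log)
{
    // jokes.txt lives beside run.csx under the function's wwwroot folder.
    var jokes = File.ReadAllLines(@"D:\home\site\wwwroot\TimerTriggerCSharp1\jokes.txt");

    // Pick a random joke for today's run.
    var joke = jokes[new Random().Next(jokes.Length)];
    log.Info($"Joke of the day: {joke}");
}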

SendGrid
Next up, we need to create ourselves a SendGrid account.
SendGrid
SendGrid has a free account tier for sending email out.

We need to set up our Application Settings to hold our SendGrid API key. We can go into Application Settings and add a new key; let's call it SendGridKey.
sendgrid key

We then need to go into the Integrate section of the function, where we want a new Output for our function. We will select the SendGrid output to send an email out.
SendGrid Out

We can then put in our API key app setting like below, and a from address if we want.
SendGrid ApiKey

We now need to modify our function.json file to add the SendGrid information; add this inside the bindings array as another section.
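A hedged example of that section (the addresses are placeholders; note that apiKey holds the name of the app setting, not the key itself):

{
  "type": "sendGrid",
  "name": "message",
  "apiKey": "SendGridKey",
  "to": "<to-address>",
  "from": "<from-address>",
  "subject": "Joke of the day",
  "direction": "out"
}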

Finally we need to write the code to send the email out.
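A minimal sketch, assuming the v1 SendGrid binding's Mail type from SendGrid.Helpers.Mail and the jokes.txt path from earlier:

#r "SendGrid"
using System;
using System.IO;
using SendGrid.Helpers.Mail;

public static void Run(TimerInfo myTimer, TraceWriter log, out Mail message)
{
    var jokes = File.ReadAllLines(@"D:\home\site\wwwroot\TimerTriggerCSharp1\jokes.txt");
    var joke = jokes[new Random().Next(jokes.Length)];

    // To/from/subject can come from function.json; here we only set the body.
    message = new Mail();
    message.AddContent(new Content { Type = "text/plain", Value = joke });

    log.Info($"Sent joke: {joke}");
}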

With this we will be able to send the email to a user with a joke of the day.
EmailJoke

Last but not least, we also need to set a schedule by clicking on Integrate and setting the time for the schedule to send an email out. If you want to learn more about cron timers, you can visit this site: https://codehollow.com/2017/02/azure-functions-time-trigger-cron-cheat-sheet/
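For example, a daily 9:30 AM schedule uses the six-field NCRONTAB format (second, minute, hour, day, month, day-of-week) in the timer binding of function.json:

{
  "type": "timerTrigger",
  "name": "myTimer",
  "schedule": "0 30 9 * * *",
  "direction": "in"
}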
Azure Timer Schedule

Summary

There it is: how to send out a joke-of-the-day email with SendGrid and Azure Functions. Have fun 😛

elastic search

For the past year I have been evaluating, working with, and even presenting on ElasticSearch, and I thought it would be good to showcase a series of articles on ElasticSearch for .NET Developers and what it brings to the table when developing a software solution. I also did a talk on ElasticSearch at Montreal DevTeach; if you are interested in my slides, feel free to view them on slideshare or my blog.

Without further ado, let's get started and look at what ElasticSearch really is.

First off, some consider ElasticSearch to be the ELK Stack, but with the new branding they have been trying to call it the Elastic Stack instead. Although ELK has stuck with many people and Google searches, from here on we will call it just the Elastic Stack.

So what does the Elastic Stack consist of, you may wonder?
Basically the Elastic Stack consists of ElasticSearch, Logstash and Kibana. Let's go through them individually so that we can understand what each component does and brings to a software solution.

ElasticSearch

This is the core search engine or store that you use for storing your data, and it is built in Java. It stores documents in JSON format and uses Lucene to index them; ElasticSearch provides and builds metadata on top of the index created by Lucene. (Note: Lucene is built in Java; there is also a port of Lucene to .NET called NLucene.)

Some people may think that ElasticSearch is a database that we store data into, like mysql, postgres or mssql, but I would say Elastic is not really a database, since there is no db file and it does not have relationships like SQL. It's more like a NoSQL solution, but not quite like mongodb either. The best way to describe it, I would say, is to think of it as a search engine that you store documents in. I know it's confusing at first, but don't worry, it will become clear later, or once you start playing around with it.
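To make that concrete, here is a minimal sketch that stores and fetches a JSON document purely over the REST API (it assumes a local node on the default port 9200 and a hypothetical customer index, which ElasticSearch creates on the fly):

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class ElasticSearchSketch
{
    static async Task Main()
    {
        var http = new HttpClient();

        // Index (store) a JSON document under /index/type/id.
        var doc = new StringContent("{\"name\":\"Taswar\",\"city\":\"Ottawa\"}", Encoding.UTF8, "application/json");
        var put = await http.PutAsync("http://localhost:9200/customer/external/1", doc);
        Console.WriteLine(await put.Content.ReadAsStringAsync());

        // Fetch it back by id.
        var get = await http.GetAsync("http://localhost:9200/customer/external/1");
        Console.WriteLine(await get.Content.ReadAsStringAsync());
    }
}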

Logstash

logstash

Logstash is another module/component/service. You can use Logstash without using ElasticSearch; the main functionality of Logstash is to get some input, filter it, and output it somewhere. Again, the output does not need to be ElasticSearch, but usually it is. An example of using Logstash could be: I have IIS logs or Apache logs, I need to input them into Logstash, and I would like to geo-tag each of the IP addresses and store them into ElasticSearch or some database. The main idea of Logstash is simple: INPUT -> FILTER -> OUTPUT. One more thing to note is that Logstash is built with JRuby on the JVM, and there are tons of open source plugins for Logstash that one can download, even to anonymize or encrypt the data before outputting it.
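As a sketch of that Apache-logs example (the file path is an assumption, but grok and geoip are standard Logstash plugins), a pipeline config has the same INPUT -> FILTER -> OUTPUT shape:

input {
  file {
    path => "/var/log/apache2/access.log"
    start_position => "beginning"
  }
}

filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  geoip {
    source => "clientip"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}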

Kibana

kibana

Kibana is the graphical user interface for ElasticSearch; it is used for analyzing your data and for creating charts from ElasticSearch data. It is quite powerful; one can slice and dice many kinds of charts using Kibana.
Kibana is built with node.js and is a single-page application (SPA).

Beats

beats

Beats are basically lightweight shippers of data. There are many types of beats: e.g. Filebeat is used for shipping file data (e.g. apache.log) to ElasticSearch or Logstash, and Winlogbeat allows one to ship Windows events to ElasticSearch or Logstash; check out the beats offered by Elastic. You can also write your own beat using the libbeat library, and not to mention that beats are actually written in Golang. If you are interested in using Golang with VSCode, check out the Channel 9 video I did for Golang and VSCode.
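As a concrete example, a minimal filebeat.yml from the Filebeat 5.x era (the log path and Logstash host are assumptions) that tails a log file and ships it to Logstash would look something like this:

filebeat.prospectors:
  - input_type: log
    paths:
      - /var/log/apache2/*.log

output.logstash:
  hosts: ["localhost:5044"]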

So here we sum up the main components of the Elastic Stack. I will go through each component individually in upcoming blog posts, going from the install process through configuration.

satazureday

I had the opportunity to speak at satazureday (Azure Saturday) here in Ottawa last week, and went through the topic of Azure Key Vault. I also had a co-presenter to share the talk with: an up-and-coming public speaker, Petrica Mihai. He created most of the slides and the demo code in C# 🙂
You can view the code at https://github.com/mihaipetri/AzureKeyVaultNet
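If you just want the gist of consuming a secret from C#, here is a minimal sketch using the 2017-era Microsoft.Azure.KeyVault and ADAL packages (the vault name, application ID and key are placeholders; see the demo repo above for the complete code):

using System.Threading.Tasks;
using Microsoft.Azure.KeyVault;
using Microsoft.IdentityModel.Clients.ActiveDirectory;

public static class KeyVaultExample
{
    // Authenticate with an Azure AD application ID and key, then read a secret by its URI.
    public static async Task<string> GetSecretAsync()
    {
        var client = new KeyVaultClient(async (authority, resource, scope) =>
        {
            var context = new AuthenticationContext(authority);
            var credential = new ClientCredential("<application-id>", "<application-key>");
            var result = await context.AcquireTokenAsync(resource, credential);
            return result.AccessToken;
        });

        // Secrets are addressed by URL: https://{keyvault-name}.vault.azure.net/secrets/{name}
        var secret = await client.GetSecretAsync("https://<keyvault-name>.vault.azure.net/secrets/SQLPassword");
        return secret.Value;
    }
}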

In any case if you are interested here are the slides on Azure Key Vault.

And the transcript:

  1. Azure Key Vault • What are we trying to solve with KeyVault?
    • Let’s step back and look at a Cloud Design Pattern
    • External Configuration Pattern
  2. External Configuration Pattern
  3. Typical Application
  4. Storing Configuration in file
  5. Multiple application
  6. External Configuration Pattern
    • Helps move configuration information out of the application deployment
    • This pattern can provide for easier management and control of configuration data
    • For sharing configuration data across applications and other application instances
  7. Problems
    • Configuration becomes part of deployment
    • Multiple applications share the same configuration
    • Hard to have access control over the configuration
  8. External Configuration Pattern
  9. When to use the pattern
    • When you have shared configuration, multiple application
    • You want to manage configuration centrally by DevOps
    • Provide audit for each configuration
  10. When not to use
    • When you only have a single application, there is no need to use this pattern; it will make things more complex
  11. Cloud Solution Offerings
    • Azure KeyVault (Today’sTalk)
    • Vault by Hashicorp
    • AWS KMS
    • Keywhiz
  12. What is Azure Key Vault ?
    • Safeguard cryptographic keys and secrets used by cloud applications and services
    • Use hardware security modules (HSMs)
    • Simplify and automate tasks for SSL/TLS certificates
  13. Gemalto / SafeNet – Hardware Security Module
  14. How Azure Key Vault can help you ?
    • Customers can import their own keys into Azure, and manage them
    • Keys are stored in a vault and invoked by URI when needed
    • KeyVault performs cryptographic operations on behalf of the application
    • The application does not see the customers’ keys
    • KeyVault is designed so that Microsoft does not see or extract your keys
    • Near real-time logging of key usage
  15. Bring Your Own Key (BYOK)
  16. Create a Key Vault New-AzureRmKeyVault -VaultName 'MihaiKeyVault' -ResourceGroupName 'MihaiResourceGroup' -Location 'Canada East'
  17. Objects, identifiers, and versioning
    • Objects stored in Azure KeyVault (keys, secrets, certificates) retain versions whenever a new instance of an object is created, and each version has a unique identifier and URL
    • https://{keyvault-name}.vault.azure.net/{object-type}/{object-name}/{object-version}
  18. Azure Key Vault keys
    • Cryptographic keys in Azure KeyVault are represented as JSON Web Key (JWK) objects
    • RSA: A 2048-bit RSA key. This is a "soft" key, which is processed in software by KeyVault but is stored encrypted at rest using a system key that is in an HSM
    • RSA-HSM: An RSA key that is processed in an HSM
    • https://myvault.vault.azure.net/keys/mykey/abcdea84815e4ca8bc19cf8eb943ee88
  19. Create a Key Vault key $key = Add-AzureKeyVaultKey -VaultName 'MihaiKeyVault' -Name 'MihaiFirstKey' -Destination 'Software'
  20. Azure Key Vault secrets
    • Secrets are octet sequences with a maximum size of 25k bytes each
    • The Azure KeyVault service does not provide any semantics for secrets; it accepts the data, encrypts and stores it, returning a secret identifier, "id", that may be used to retrieve the secret
    • https://myvault.vault.azure.net/secrets/mysecret/abcdea54614e4ca7ge14cf2eb943ab23
  21. Create a Key Vault secret $secret = Set-AzureKeyVaultSecret -VaultName 'MihaiKeyVault' -Name 'SQLPassword' -SecretValue $secretvalue
  22. Azure Key Vault certificates
    • Import/generate existing certificates, self-signed or enrolled from a Public Certificate Authority (DigiCert, GlobalSign and WoSign)
    • When a KeyVault certificate is created, an addressable key and secret are also created with the same name
    • https://myvault.vault.azure.net/certificates/mycertificate/abcdea84815e4ca8bc19cf8eb943bb45
  23. Create a Key Vault certificate
  24. Secure your Key Vault
    • Access to a key vault is controlled through two separate interfaces: management plane and data plane
    • Authentication establishes the identity of the caller
    • Authorization determines what operations the caller is allowed to perform
    • For authentication, both the management plane and the data plane use Azure Active Directory
    • For authorization, the management plane uses role-based access control (RBAC) while the data plane uses key vault access policies
  25. Access Control
    • Access control based on Azure AD
    • Access assigned at the Vault level
      – permissions to keys
      – permissions to secrets
    • Authentication against Azure AD
      – application ID and key
      – application ID and certificate
  26. Azure Managed Service Identity (MSI)
    • Manage the credentials that need to be in your code for authenticating to cloud services
    • Azure KeyVault provides a way to securely store credentials and other keys and secrets, but your code needs to authenticate to Key Vault to retrieve them
    • Managed Service Identity (MSI) makes solving this problem simpler by giving Azure services an automatically managed identity in Azure Active Directory (Azure AD)
    • You can use this identity to authenticate to any service that supports Azure AD authentication, including KeyVault, without having any credentials in your code
  27. Azure Key Vault Logging
    • Monitor how and when your key vaults are accessed, and by whom
    • Save information in an Azure storage account that you provide
    • Use standard Azure access control methods to secure your logs by restricting who can access them
    • Delete logs that you no longer want to keep in your storage account
  28. Azure Key Vault Pricing
    • Operations (Standard or Premium) $0.030 per 10,000 operations
    • Advanced Operations (Standard or Premium) $0.150 per 10,000 operations
    • Certificate Renewals (Standard or Premium) $3.00 per renewal
    • Hardware Security Module Protected Keys (Premium only) $1.00 per key
  29. Azure Key Vault DEMO
    • Create KeyVault, Secrets, Keys and Certificates
    • Create Azure AD Application
    • Consuming Secrets and Keys: https://azurekeyvaultnet.azurewebsites.net – live demo
    • https://github.com/mihaipetri/AzureKeyVaultNet – demo code
Taswar Bhatti - Cloud Design Patterns

This week I gave a talk on Cloud Design Patterns at the Ottawa .NET Community. I wanted to share the slides here, and will most likely write articles on the topic using real-world examples, with samples in C# and node.js, in AWS and Azure.

For the talk I went through what Cloud Design Patterns are, and mainly focused on the patterns below without using any platform specifics (i.e. cloud agnostic).

  • The External Configuration Pattern
  • The Cache Aside Pattern
  • The Federated Identity Pattern
  • The Valet Key Pattern
  • The Gatekeeper Pattern
  • The Circuit Breaker Pattern

For each of them I also went through when you should use the pattern and when not to use it, and I provided Cloud Solution Offerings that one can use to implement the pattern.

Enjoy the slides 🙂

The Transcript:

  1. Agenda
    • What are Patterns?
    • The External Configuration Pattern
    • The Cache Aside Pattern
    • The Federated Identity Pattern
    • The Valet Key Pattern
    • The Gatekeeper Pattern
    • The Circuit Breaker Pattern
    • The Retry Pattern
    • The Strangler Pattern
  2. What are Patterns?
    • General reusable solution to a recurring problem
    • A template on how to solve a problem
    • Best practices
    • Patterns allow developers to communicate with each other using well-known and understood names for software interactions.
  3. External Configuration Pattern
    • Helps move configuration information out of the application deployment
    • This pattern can provide for easier management and control of configuration data
    • For sharing configuration data across applications and other application instances
  4. Typical Application
  5. Storing Configuration in file
  6. Multiple application
  7. Problems
    • Configuration becomes part of deployment
    • Multiple applications share the same configuration
    • Hard to have access control over the configuration
  8. External Configuration Pattern
  9. When to use the pattern
    • When you have shared configuration, multiple application
    • You want to manage configuration centrally by DevOps
    • Provide audit for each configuration
  10. When not to use
    • When you only have a single application, there is no need to use this pattern; it will make things more complex
  11. Cloud Solution Offerings
    • Azure Key Vault
    • Vault by Hashicorp
    • AWS KMS
    • Keywhiz
  12. Cache Aside Pattern
    • Load data on demand into a cache from datastore
    • Helps improve performance
    • Helps maintain consistency between data held in the cache and data in the underlying data store.
  13. Typical Application
  14. Cache Aside Pattern
  15. When to use the pattern
    • Resource demand is unpredictable.
    • This pattern enables applications to load data on demand
    • It makes no assumptions about which data an application will require in advance
  16. When not to use
    • Don’t use it for data that changes very often
  17. Things to consider
    • Sometimes data can be changed from an outside process
    • Have an expiry for the data in the cache
    • When updating data, invalidate the cache before updating the data in the database
    • Pre-populate the data if possible (see the sketch below)
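To make the considerations above concrete, here is a minimal cache-aside sketch using StackExchange.Redis and Json.NET (the Product type, key naming and 10-minute expiry are assumptions):

using System;
using System.Threading.Tasks;
using Newtonsoft.Json;
using StackExchange.Redis;

public class Product { public string Id { get; set; } public string Name { get; set; } }

public class ProductCache
{
    private readonly IDatabase _cache;
    public ProductCache(ConnectionMultiplexer redis) { _cache = redis.GetDatabase(); }

    public async Task<Product> GetProductAsync(string id, Func<string, Task<Product>> loadFromDb)
    {
        // 1. Try the cache first.
        RedisValue cached = await _cache.StringGetAsync("product:" + id);
        if (cached.HasValue)
            return JsonConvert.DeserializeObject<Product>(cached);

        // 2. Cache miss: load from the data store on demand.
        Product product = await loadFromDb(id);

        // 3. Populate the cache with an expiry so stale entries eventually fall out.
        await _cache.StringSetAsync("product:" + id, JsonConvert.SerializeObject(product), TimeSpan.FromMinutes(10));
        return product;
    }
}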
  18. Cloud Offerings
    • Redis (Azure and AWS)
    • Memcache
    • Hazelcast
    • ElastiCache (AWS)
  19. Federated Identity Pattern
    • Delegate authentication to an external identity provider
    • Simplify development, minimize the requirement for user administration
    • Improve the user experience of the application
    • Centralize providing MFA for user authentication
  20. Typical Application
  21. Problem
    • Complex development and maintenance (duplicated code)
    • MFA is not an easy thing
    • User administration is a pain with access control
    • Hard to keep the system secure
    • No single sign-on (SSO); everyone needs to log in again to different systems
  22. Federated Identity Pattern
  23. When to use
    • When you have multiple applications and want to provide SSO for them
    • Federated identity with multiple partners
    • Federated identity in SAAS applications
  24. When not to use it
    • You already have a single application and have custom code that allows you to login
  25. Things to consider
    • The identity server needs to be highly available
    • Single point of failure, must have HA
    • RBAC; the identity server usually does not have authorization information
    • Claims and scope within the security auth token
  26. Cloud Offerings
    • Azure AD
    • Gemalto STA and SAS
    • Amazon IAM
    • GCP Cloud IAM
  27. Valet Key Pattern
    • Use a token that provides clients with restricted direct access to a specific resource
    • Offload data transfer from the application
    • Minimize cost and maximize scalability and performance
  28. Typical Application Client App Storage
  29. Problem
  30. Valet Key Pattern
  31. Client App Generate Token Limited Time And Scope Storage
  32. When to use it
    • The application has limited resources
    • To minimize operational cost
    • Many interactions with external resources (upload, download)
    • When the data is stored in a remote data store or a different datacenter
  33. When not to use it
    • When you need to transform the data before upload or download
  34. Cloud Offerings
    • Azure Blob Storage
    • Amazon S3
    • GCP Cloud Storage
  35. Gatekeeper Pattern
    • Use a dedicated host instance that acts as a broker between clients and services
    • Protect applications and services
    • Validates and sanitizes requests, and passes requests and data between them
    • Provide an additional layer of security, and limit the attack surface of the system
  36. Typical Application
  37. Problem
  38. Gatekeeper Pattern
  39. When to use it
    • Sensitive information (health care, authentication)
    • Distributed systems where request validation is performed separately
  40. When not to use
    • Performance vs security
  41. Things to consider
    • The WAF should not hold any keys or sensitive information
    • Use a secure communication channel
    • Auto scale
    • Endpoint IP address (when scaling the application, does the WAF know the new applications?)
  42. Circuit Breaker Pattern
    • To handle faults that might take a variable amount of time to recover
    • When connecting to a remote service or resource
  43. Typical Application
  44. Problem
  45. When to use it
    • To prevent an application from trying to invoke a remote service or access a shared resource if this operation is highly likely to fail
    • Better user experience
  46. When not to use
    • Handling access to local private resources in an application, such as an in-memory data structure
    • Creates an overhead
    • Not a substitute for handling exceptions in the business logic of your applications
  47. Libraries (see the sketch below)
    • Polly (http://www.thepollyproject.org/)
    • Netflix Hystrix (https://github.com/Netflix/Hystrix/wiki)
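As a quick illustration of the pattern with the Polly library listed above, a minimal sketch (the failure threshold and break duration are assumptions):

using System;
using System.Net.Http;
using Polly;

class CircuitBreakerExample
{
    static readonly HttpClient Http = new HttpClient();

    // Open the circuit after 3 consecutive failures; stay open for 30 seconds.
    static readonly Policy Breaker = Policy
        .Handle<HttpRequestException>()
        .CircuitBreaker(exceptionsAllowedBeforeBreaking: 3, durationOfBreak: TimeSpan.FromSeconds(30));

    static string CallRemoteService(string url)
    {
        // While the circuit is open, Execute throws BrokenCircuitException
        // immediately instead of hammering the failing service.
        return Breaker.Execute(() => Http.GetStringAsync(url).GetAwaiter().GetResult());
    }
}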
  48. Retry Pattern
    • Enable an application to handle transient failures
    • When the application tries to connect to a service or network resource
    • By transparently retrying a failed operation
  49. Typical Application has Network Failure
  50. Retry Pattern
    • Retry after 2, 5 or 10 seconds
  51. When to use it
    • Use retry only for transient failures that are more than likely to resolve themselves quickly
    • Match the retry policies with the application
    • Otherwise use the circuit breaker pattern
  52. When not to use it
    • Don't cause a chain reaction to all components
    • For internal exceptions caused by business logic
    • Log all retry attempts to the service
  53. Libraries (see the sketch below)
    • Roll your own code
    • Polly (http://www.thepollyproject.org/)
    • Netflix Hystrix (https://github.com/Netflix/Hystrix/wiki)
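And a matching retry sketch with Polly, using the 2/5/10-second backoff from the slide above (the handled exception type is an assumption):

using System;
using System.Net.Http;
using Polly;

class RetryExample
{
    static readonly HttpClient Http = new HttpClient();

    static string CallWithRetry(string url)
    {
        var retry = Policy
            .Handle<HttpRequestException>()
            .WaitAndRetry(
                new[] { TimeSpan.FromSeconds(2), TimeSpan.FromSeconds(5), TimeSpan.FromSeconds(10) },
                // Log every attempt, as the slide recommends.
                (ex, delay) => Console.WriteLine($"Retrying in {delay.TotalSeconds}s after: {ex.Message}"));

        return retry.Execute(() => Http.GetStringAsync(url).GetAwaiter().GetResult());
    }
}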
  54. Strangler Pattern
    • Incrementally migrate a legacy system
    • Gradually replace specific pieces of functionality with new applications and services
    • Features from the legacy system are replaced by new system features eventually
    • Strangling the old system and allowing you to decommission it
  55. Monolith Application
  56. When to use
    • Gradually migrating a back-end application to a new architecture
  57. When not to use
    • When requests to the back-end system cannot be intercepted
    • For smaller systems where the complexity of wholesale replacement is low
  58. Considerations
    • Handle services and data stores that are potentially used by both new and legacy systems
    • Make sure both can access these resources side-by-side
    • When migration is complete, the strangler façade will either go away or evolve into an adaptor for legacy clients
    • Make sure the façade doesn't become a single point of failure or a performance bottleneck
Redis

Redis GeoSpatial data sets are actually just Sorted Sets in Redis; there is no secret about it. Basically Redis provides an easy way to store geospatial data, like longitude/latitude coordinates. Let's look at some of the commands that Redis provides for geospatial data.

Redis Geo Datatype – Operations

  • GEOADD: Adds or updates one or more members in a Geo set. O(log(N)) where N is the number of elements in the sorted set.
  • GEODIST: Returns the distance between two members in the geospatial index represented by the sorted set. O(log(N)).
  • GEOHASH: Gets valid Geohash strings representing the position of one or more elements from the Geo set. O(log(N)), where N is the number of elements in the sorted set.
  • GEOPOS: Returns the longitude/latitude of all the specified members of the geospatial sorted set at key. O(log(N)), where N is the number of elements in the sorted set.
  • GEORADIUS: Returns the members of a sorted set populated with geospatial information using GEOADD which are within the borders of the area specified with the center location and the maximum distance from the center (the radius). O(N+log(M))
  • GEORADIUSBYMEMBER: Same as GEORADIUS, with the only difference that instead of taking a longitude and latitude value as the center of the area to query, it takes the name of a member already existing inside the geospatial index. O(N+log(M))

I wanted to use some open data to showcase the usage of Redis' GeoSpatial datatype. I chose basketball courts in Ottawa, since my son plays a bit of basketball.
Here is a map of basketball courts in Ottawa.

Ottawa Basketball Courts

C# code using Redis Geo Set Datatype
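Since the embedded code isn't shown here, a minimal sketch with StackExchange.Redis (the court names and coordinates are made-up stand-ins for the Ottawa open data):

using System;
using StackExchange.Redis;

class RedisGeoExample
{
    static void Main()
    {
        var redis = ConnectionMultiplexer.Connect("localhost");
        var db = redis.GetDatabase();

        // GEOADD: store courts as (longitude, latitude, member) in a Geo set.
        db.GeoAdd("basketball:courts", -75.6972, 45.4215, "Downtown Court");
        db.GeoAdd("basketball:courts", -75.9013, 45.3168, "Kanata Court");

        // GEODIST: distance between two members.
        double? km = db.GeoDistance("basketball:courts", "Downtown Court", "Kanata Court", GeoUnit.Kilometers);
        Console.WriteLine($"Distance: {km} km");

        // GEORADIUS: everything within 10 km of a point.
        GeoRadiusResult[] nearby = db.GeoRadius("basketball:courts", -75.6972, 45.4215, 10, GeoUnit.Kilometers);
        foreach (var court in nearby)
            Console.WriteLine(court.Member);
    }
}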

So this covers the basic usage of the Redis GeoSpatial datatype. In the next blog post I will cover using Sentinel, which provides high availability for Redis.

For the code please visit
https://github.com/taswar/RedisForNetDevelopers/tree/master/13.RedisGeo/RediusGeo

For previous Redis topics

  1. Intro to Redis for .NET Developers
  2. Redis for .NET Developer – Connecting with C#
  3. Redis for .NET Developer – String Datatype
  4. Redis for .NET Developer – String Datatype part 2
  5. Redis for .NET Developer – Hash Datatype
  6. Redis for .NET Developer – List Datatype
  7. Redis for .NET Developer – Redis Sets Datatype
  8. Redis for .NET Developer – Redis Sorted Sets Datatype
  9. Redis for .NET Developer – Redis Hyperloglog
  10. Redis for .NET Developer – Redis Pub Sub
  11. Redis for .NET Developers – Redis Pipeline Batching
  12. Redis for .NET Developers – Redis Transactions
  13. Redis for .NET Developers – Lua Scripting
  14. Redis for .NET Developers – Redis running in Docker
  15. Redis for .NET Developers – Redis running in Azure
  16. Redis for .NET Developers – Redis running in AWS ElastiCache
Redis

Redis Lua Scripting

Redis provides a way to extend its functionality on the server side through support for Lua scripting. If you are coming from a relational database world, you already know that you can use stored procedures to extend the functionality of your relational database. You may also know that some people frown upon using stored procedures; I think one could argue that scripting in Redis belongs in the same category. Nevertheless, it's still good to know what you can do with Redis and Lua.

If you want to learn more Lua try this site http://tylerneylon.com/a/learn-lua/

In order to call a Lua script from the StackExchange.Redis library, one can use the LuaScript class, or the IServer.ScriptLoad(Async), IServer.ScriptExists(Async), IServer.ScriptFlush(Async), IDatabase.ScriptEvaluate, and IDatabaseAsync.ScriptEvaluateAsync methods.

Let's try to do something with the Redis console first by using redis-cli.
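It looks something like this (the SHA1 value shown is illustrative; yours will differ):

127.0.0.1:6379> SCRIPT LOAD "return 'hello redis'"
"1b936e3fe509bcbc9cd0664897bbe8fd0cac101b"
127.0.0.1:6379> EVALSHA 1b936e3fe509bcbc9cd0664897bbe8fd0cac101b 0
"hello redis"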

As you can see, we first load the script and get back a SHA1 hash for it. Redis stores the script in one of its internal mapping tables, and we can then call the script by passing the SHA1 hash to the EVALSHA command, which in this case gave us back "hello redis".

Remember: When Redis is running your Lua script, it will not run anything else because Redis is single threaded

C# code using Redis Lua Script
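Since the embedded code isn't shown here, a minimal sketch of evaluating a parameterized script with StackExchange.Redis (the key and value names are assumptions):

using System;
using StackExchange.Redis;

class RedisLuaExample
{
    static void Main()
    {
        var redis = ConnectionMultiplexer.Connect("localhost");
        var db = redis.GetDatabase();

        // @key and @value become named parameters of the prepared script.
        var script = LuaScript.Prepare("return redis.call('SET', @key, @value)");
        var result = db.ScriptEvaluate(script, new { key = (RedisKey)"greeting", value = "hello redis" });
        Console.WriteLine(result); // OK
    }
}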

You may realize that every time we call the server we need to load the script and then execute it. StackExchange.Redis also provides a way to avoid the overhead of transmitting the script text on every call: one can convert a LuaScript into a LoadedLuaScript, like the code below.
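A hedged sketch of that conversion (continuing the objects from the previous snippet):

// Load the script into Redis' script cache once, typically at application startup.
var prepared = LuaScript.Prepare("return redis.call('GET', @key)");
var server = redis.GetServer("localhost", 6379);
LoadedLuaScript loaded = prepared.Load(server);

// Cache 'loaded' and reuse it; Evaluate sends EVALSHA instead of the script text.
var value = loaded.Evaluate(db, new { key = (RedisKey)"greeting" });
Console.WriteLine(value); // hello redis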

You would cache the loaded value somewhere in your application; usually it's best to load the scripts when you start your application.

So this covers the basic usage of Redis Lua scripting. In the next blog post I will cover how to use geospatial data in Redis.

For the code please visit
https://github.com/taswar/RedisForNetDevelopers/tree/master/12.RedisLua/RedisLua

For previous Redis topics

  1. Intro to Redis for .NET Developers
  2. Redis for .NET Developer – Connecting with C#
  3. Redis for .NET Developer – String Datatype
  4. Redis for .NET Developer – String Datatype part 2
  5. Redis for .NET Developer – Hash Datatype
  6. Redis for .NET Developer – List Datatype
  7. Redis for .NET Developer – Redis Sets Datatype
  8. Redis for .NET Developer – Redis Sorted Sets Datatype
  9. Redis for .NET Developer – Redis Hyperloglog
  10. Redis for .NET Developer – Redis Pub Sub
  11. Redis for .NET Developers – Redis Pipeline Batching
  12. Redis for .NET Developers – Redis Transactions
oauth and openid

I wanted to share my DevTeach Montreal 2017 talk, where I talked about OAuth and OpenID Connect: the types of OAuth grants, how to consume them, the flows in OAuth, where OpenID Connect comes into play, and what it solves.

Hope you like the presentation, and if you are interested in more security topics, ping me and let me know what you would be interested in.

elastic search

I wanted to share my DevTeach talk slides on Elastic Search, where I went into introducing the Elastic Stack, consisting of ElasticSearch, Logstash and Kibana. I also went into the constraints that we had and the design approaches that we took.

Hope you enjoy and expect more ElasticSearch blogs this year 🙂
