Thursday, November 18, 2021

Routing a web app through an Azure Application Gateway instance

In the Azure Application Gateway's HTTP settings:

* enable the "Use for App Service" setting.

* set the value of the "override backend path" option to kaushik**.net.
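
The same two settings can, as a sketch, also be applied from the Azure CLI; the resource group, gateway, and HTTP-setting names below are placeholders, and the flags are assumed from `az network application-gateway http-settings update` (`--host-name-from-backend-pool` corresponds to the App Service host-name toggle, `--path` to the override backend path):

```shell
# placeholder resource names; verify flag names against your az CLI version
az network application-gateway http-settings update \
  --resource-group myRG \
  --gateway-name myAppGateway \
  --name appGatewayBackendHttpSettings \
  --host-name-from-backend-pool true \
  --path /
```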


transaction logs are written asynchronously

What is the purpose of the change feed?

The purpose of the change feed is to provide transaction logs of all the changes that occur to the blobs and the blob metadata in your storage account.

The change feed provides an ordered, guaranteed, durable, immutable, read-only log of these changes. Client applications can read these logs at any time, in either streaming or batch mode.

The change feed enables you to build efficient and scalable solutions that process change events in your Blob Storage account at a low cost.
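
The change feed is enabled per storage account; as a sketch, assuming the `az storage account blob-service-properties update` command (account and group names are placeholders):

```shell
# enable the blob change feed on a storage account (placeholder names)
az storage account blob-service-properties update \
  --resource-group myRG \
  --account-name mystorageacct \
  --enable-change-feed true
```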


Sunday, November 14, 2021

Azure Function triggered from a blob upload

Azure Storage events allow applications to react to events such as the creation and deletion of blobs.

Common Blob storage event scenarios include image or video processing, search indexing, or any file-oriented workflow.

Events are pushed through Azure Event Grid to subscribers such as Azure Functions, Azure Logic Apps, or even your own HTTP listener.
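
As a sketch, an Azure Function can react to such blob uploads through a blob trigger binding; the container name `uploads` and binding name `myBlob` are placeholders in this function.json fragment:

```json
{
  "bindings": [
    {
      "name": "myBlob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "uploads/{name}",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
```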


What is Azure Event Hubs used for?

Azure Event Hubs is used for telemetry and distributed data streaming.

This service provides a single solution that enables rapid data retrieval for real-time processing as well as repeated replay of stored raw data.

It can capture the streaming data into files for later processing and analysis.


It has the following characteristics:

* low latency

* capable of receiving and processing millions of events per second

* at-least-once delivery
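
Because delivery is at least once, a consumer can see the same event more than once; a minimal idempotent-consumer sketch (plain Python with hypothetical event IDs, not the Event Hubs SDK):

```python
# at-least-once delivery means duplicates are possible; track processed
# event IDs and skip redeliveries so processing stays idempotent
def process_events(events):
    seen, results = set(), []
    for event_id, payload in events:
        if event_id in seen:
            continue  # duplicate redelivery: already handled
        seen.add(event_id)
        results.append(payload.upper())
    return results

batch = [("e1", "temp=21"), ("e2", "temp=22"), ("e1", "temp=21")]  # e1 redelivered
print(process_events(batch))  # ['TEMP=21', 'TEMP=22']
```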



What are the steps for using integration accounts with Azure Logic Apps and the Enterprise Integration Pack?

1. Create an integration account in the Azure portal.

You can define custom metadata for artifacts in integration accounts and retrieve that metadata at runtime for your logic app to use.

For example, you can provide metadata for artifacts such as partners, agreements, schemas & maps

- all of which store metadata as key-value pairs.


2. Link the logic app to the integration account.

The logic app must be linked to the integration account that holds the artifacts and artifact metadata you want to use.


3. Add partners, schemas, certificates, maps & agreements.


4. Create a custom connector for the logic app.


Which security feature is used by a Function app to authorize requests?

User claims.

Azure AD uses JSON-based tokens ( JWTs ) that contain claims.
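
A JWT is three base64url segments (header.payload.signature), and the claims live in the payload; a minimal sketch that decodes the claims without verifying the signature (the sample token and claim values below are made up for illustration):

```python
import base64
import json

def decode_jwt_claims(token: str) -> dict:
    """Return the (unverified) claims from a JWT's payload segment."""
    payload_b64 = token.split(".")[1]
    # JWTs strip base64 padding; restore it before decoding
    payload_b64 += "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(payload_b64))

def _segment(obj) -> str:
    # helper to build a sample token segment (illustration only)
    return base64.urlsafe_b64encode(json.dumps(obj).encode()).rstrip(b"=").decode()

claims = {"aud": "api://my-func-app", "name": "Kaushik", "roles": ["reader"]}
sample = f'{_segment({"alg": "RS256", "typ": "JWT"})}.{_segment(claims)}.sig'
print(decode_jwt_claims(sample)["name"])  # Kaushik
```

In production the signature must be validated against the Azure AD signing keys before any claim is trusted.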


How does a web app delegate sign-in to Azure AD and obtain a token?

User authentication happens via the browser. The OpenID Connect protocol uses standard HTTP protocol messages.
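
The browser redirect at the start of that flow is just a GET to the tenant's authorize endpoint; a sketch of building it (tenant, client_id, and redirect URI are placeholder values):

```python
from urllib.parse import urlencode

# placeholder app-registration values
tenant = "contoso.onmicrosoft.com"
params = {
    "client_id": "00000000-0000-0000-0000-000000000000",
    "response_type": "code",
    "redirect_uri": "https://myapp.example.com/signin-oidc",
    "scope": "openid profile",
    "state": "xyz123",
}
authorize_url = (
    f"https://login.microsoftonline.com/{tenant}/oauth2/v2.0/authorize?"
    + urlencode(params)
)
print(authorize_url)
```

Azure AD answers on the redirect URI with a code (or token) that the web app then exchanges for the actual token.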


How do you use an Integration Service Environment in Azure?

We can access a VNet from Azure Logic Apps by using integration service environments ( ISEs ).

Scenario - when to use it: sometimes your logic apps and integration accounts need access to secured resources, such as virtual machines ( VMs ) and other systems or services, that are inside an Azure virtual network.

To set up this access, you can create an ISE, where you can run your logic apps and create your integration accounts.

reference

https://docs.microsoft.com/en-us/azure/logic-apps/connect-virtual-network-vnet-isolated-environment-overview


advanced linux operations

1)  grep ,dev, a_prefixes.csv | egrep ^100 | grep /2 | sed 's/,.*//' > a_prefix.txt   # keep only the first field of matching rows

2)  for I in $(cat b_prefixes.txt); do grep -q "$I" a_prefix.txt || echo "$I"; done | tee delete_prefix.txt | wc -l   # prefixes in b but not in a, plus a count

3)  for I in $(cat b_prefixes.txt); do grep -q "$I" a_prefix.txt && echo "$I"; done > common_prefixes.txt   # prefixes present in both files

4)  od -c b_prefixes.txt   # inspect for hidden characters such as \r

5)  grep -q 100.**.**.0/22 b_prefixes.txt   # -q: exit status only, no output

6)  for I in $(cat b_prefixes.txt); do grep -q "$I" a_prefix.txt || echo "$I"; done > delete_prefixes.txt
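
Command 6 emits every prefix from b_prefixes.txt that has no match in a_prefix.txt; the same set difference sketched in Python with hypothetical sample prefixes (note this uses exact matching, whereas grep without -x also matches substrings):

```python
# hypothetical contents of a_prefix.txt and b_prefixes.txt
a_prefixes = {"10.1.0.0/16", "10.2.0.0/16"}
b_prefixes = ["10.1.0.0/16", "10.3.0.0/16"]

# equivalent of: for I in $(cat b_prefixes.txt); do grep -q $I a_prefix.txt || echo $I; done
delete_prefixes = [p for p in b_prefixes if p not in a_prefixes]
print(delete_prefixes)  # ['10.3.0.0/16']
```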


Friday, November 12, 2021

examples of powershell & azure

 >  connection 

Connect-AzAccount

>  Get info 

Get-AzVirtualNetwork

> assign to a variable 

$vnet_list = Get-AzVirtualNetwork

> test

$vnet_list

>  create an empty variable

$prefix_list=new-object collections.arraylist

>  test

$prefix_list

$prefix_list.count

$vnet_list[0]

$vnet_list[0].AddressSpace

>  collect address prefixes from each vnet into $prefix_list

foreach ($vnet in $vnet_list) { $prefix_list.AddRange($vnet.AddressSpace.AddressPrefixes) }


>  filter specific ip address prefixes

$prefix_list_10 = $prefix_list | where-object{ $_ -like "10.*/2*" }


>  test

$prefix_list_10

$prefix_list_10.count


> write to file 

$prefix_list_10 | out-file cloud_prefixes.txt -encoding ascii