Thursday, February 16, 2023

data structure basics

 As a data professional, understanding data structures is essential to optimizing your code and making it more efficient. Here are 10 key points to keep in mind:


1. Data structures are tools that enable you to store and manipulate data effectively. They include arrays, linked lists, stacks, queues, trees, and more

2. Each data structure has its own unique properties and advantages, so it's important to choose the right one for your needs

3. Arrays are useful for storing and accessing data quickly, while linked lists are better for dynamic data that needs to be updated frequently

4. Stacks and queues are often used for managing workflows, and trees and graphs are useful for representing hierarchical or networked data

5. It's important to understand the time and space complexity of different data structures, as they can have a big impact on the performance of your code

6. Understanding the trade-offs between different data structures is crucial when optimizing code. For example, while hash tables have very fast lookup times, they can be memory-intensive and have a higher chance of collisions than other data structures

7. Memory allocation and deallocation are important considerations when working with data structures. In some cases, it may be more efficient to pre-allocate memory for a data structure rather than allocating and deallocating it dynamically

8. Advanced data structures like self-balancing binary search trees and hash tables with open addressing can be powerful tools for handling large amounts of data efficiently. However, they also require a deeper understanding of algorithms and data structures

9. While data structures are a fundamental part of computer science, they are just one tool in your toolbox. When designing algorithms, it's important to consider the entire problem and choose the best approach based on factors like time complexity, space complexity, and maintainability

10. Finally, it's worth noting that choosing the right data structure is just the first step. You also need to know how to implement it effectively and optimize it for your use case
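To make point 3 concrete, here is a small Python sketch comparing a plain list with collections.deque for queue-style access (the values are arbitrary):

```python
from collections import deque

# A list is fast for indexed access and appends at the end...
items = [10, 20, 30]
items.append(40)        # O(1) amortized
first = items[0]        # O(1) indexed access

# ...but popping from the front shifts every remaining element: O(n).
items.pop(0)

# A deque supports O(1) appends and pops at BOTH ends,
# which makes it the better fit for queues.
queue = deque([10, 20, 30])
queue.append(40)        # enqueue on the right
head = queue.popleft()  # dequeue from the left in O(1)

print(head)         # 10
print(list(queue))  # [20, 30, 40]
```

For small lists the difference is invisible; it matters once the queue grows, which is exactly the time/space-complexity point made above.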

Wednesday, February 15, 2023

Short Overview of SQL commands

DDL commands:

1. Create 

2. Alter 

3. Drop 

4. Truncate

5. Rename 


DML commands:

1. Select

2. Insert

3. Update

4. Delete


DCL commands:

1. Grant

2. Revoke
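Most of the DDL and DML commands above can be tried from Python's built-in sqlite3 module. This is only a sketch: SQLite has no TRUNCATE and no DCL (GRANT/REVOKE), and renaming is spelled ALTER TABLE ... RENAME. The table and column names are made up.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# DDL: CREATE
cur.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")

# DML: INSERT (with parameter binding, never string formatting)
cur.execute("INSERT INTO users (name) VALUES (?)", ("alice",))

# DML: UPDATE
cur.execute("UPDATE users SET name = ? WHERE name = ?", ("bob", "alice"))

# DML: SELECT
rows = cur.execute("SELECT name FROM users").fetchall()
print(rows)  # [('bob',)]

# DDL: ALTER (here used to rename the table)
cur.execute("ALTER TABLE users RENAME TO customers")

# DML: DELETE, then DDL: DROP
cur.execute("DELETE FROM customers")
cur.execute("DROP TABLE customers")
conn.close()
```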



Tuesday, February 14, 2023

python app on AWS Lambda


  1. Create a new AWS Lambda function: Go to the AWS Lambda console and create a new function by selecting "Create function". You can choose to start with a blueprint or create a function from scratch.

  2. Choose Python as the runtime: For the runtime, choose Python. You can also choose the version of Python you want to use.

  3. Write your Python code: You can write your Python code in the inline code editor in the AWS Management Console or you can upload a .zip file containing your code.

  4. Configure your function: You need to configure your function's triggers and other settings, such as environment variables, memory size, and timeout. You can do this in the AWS Management Console or using the AWS CLI.

  5. Deploy your function: After writing your code and configuring your function, you can deploy it by clicking the "Deploy" button in the AWS Management Console or using the AWS CLI.

  6. Test your function: You can test your function in the AWS Management Console by providing test inputs and checking the function's output.

  7. Monitor your function: You can monitor your function's performance, invocations, and error rates using Amazon CloudWatch.
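A minimal handler for step 3 might look like the sketch below. Lambda only requires the handler(event, context) signature; the "name" input field and the greeting are invented for illustration.

```python
import json

def lambda_handler(event, context):
    """Entry point AWS Lambda invokes: `event` carries the input,
    `context` carries runtime metadata (unused here)."""
    name = event.get("name", "world")   # hypothetical input field
    body = {"message": f"Hello, {name}!"}
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(body),
    }

# Local smoke test; in production Lambda passes a real context object.
response = lambda_handler({"name": "Lambda"}, None)
print(response["body"])
```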

sys module

The sys module contains a lot of information about Python's import system. First of all, the list of modules currently imported is available through the sys.modules variable. It's a dictionary where the key is the module name and the value is the module object.

>>> sys.modules['os']
<module 'os' from '/usr/lib/python2.7/os.pyc'>
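Because sys.modules is a plain dict, you can query it like one, which gives a quick way to check whether a module has already been imported:

```python
import sys

# True for any module already imported in this process
print('sys' in sys.modules)

import os
# The entry in sys.modules IS the module object itself
assert sys.modules['os'] is os
print(sys.modules['os'].__name__)  # os
```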


Standard libraries:


atexit allows you to register functions to call when your program exits.

argparse provides functions for parsing command-line arguments.

bisect provides bisection algorithms for searching and inserting into sorted lists.

calendar provides a number of date-related functions.

codecs provides functions for encoding and decoding data.

collections provides a variety of useful data structures.

copy provides functions for copying data.

csv provides functions for reading and writing CSV files.

datetime provides classes for handling dates and times.

fnmatch provides functions for matching Unix-style filename patterns.
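A quick taste of two of the modules above: bisect for keeping a list sorted as you insert, and collections.Counter for tallying items:

```python
import bisect
from collections import Counter

# bisect.insort keeps the list sorted (O(log n) search + O(n) insert)
scores = [10, 30, 50]
bisect.insort(scores, 40)
print(scores)               # [10, 30, 40, 50]

# bisect_left finds where a value would go without inserting it
idx = bisect.bisect_left(scores, 30)
print(idx)                  # 1

# Counter tallies hashable items in one pass
tally = Counter("abracadabra")
print(tally.most_common(2))  # [('a', 5), ('b', 2)]
```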



git commands - part 2

git rm - removes files from both the repository and the file system.

git mv - renames or moves files within the repository.

git cherry-pick - selectively applies changes from a specific commit to the current branch.

git revert - undoes changes by creating a new commit that reverses previous commits.

git clean - removes untracked files and directories from the working directory.

git archive - creates an archive (tar by default; zip is also supported) of the repository tree.

git bisect - performs a binary search through the commit history to find the commit that introduced a change.

git submodule - includes one git repository within another as a subdirectory.

git grep - searches for a specific string or pattern in tracked files.

git lfs - manages large files and binary assets in a git repository.

 


Python List Methods

 
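The body of this post appears to be missing (it was likely an image). As a stand-in, here is a runnable tour of the most common list methods:

```python
nums = [3, 1]

nums.append(4)        # add one item at the end        -> [3, 1, 4]
nums.extend([1, 5])   # append each item of an iterable -> [3, 1, 4, 1, 5]
nums.insert(0, 9)     # insert at an index              -> [9, 3, 1, 4, 1, 5]
nums.remove(1)        # drop the FIRST matching value   -> [9, 3, 4, 1, 5]
last = nums.pop()     # remove and return the last item (5)
nums.sort()           # in-place ascending sort         -> [1, 3, 4, 9]
nums.reverse()        # in-place reversal               -> [9, 4, 3, 1]

print(nums.index(4))  # 1  (position of the first 4)
print(nums.count(3))  # 1  (occurrences of 3)
print(nums)           # [9, 4, 3, 1]
```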


Forward proxy vs Reverse proxy

 Forward Proxy

---------------
A forward proxy, also known as a "proxy server," or simply "proxy," is a server that sits in front of one or more client machines and acts as an intermediary between the clients and the internet. When a client machine makes a request to a resource on the internet, the request is first sent to the forward proxy. The forward proxy then forwards the request to the internet on behalf of the client machine and returns the response to the client machine.

A forward proxy is mostly used for:
1. Client Anonymity
2. Caching
3. Traffic Control
4. Logging
5. Request/Response Transformation
6. Encryption

Reverse Proxy
---------------
A reverse proxy is a server that sits in front of one or more web servers and acts as an intermediary between the web servers and the Internet. When a client makes a request to a resource on the internet, the request is first sent to the reverse proxy. The reverse proxy then forwards the request to one of the web servers, which returns the response to the reverse proxy. The reverse proxy then returns the response to the client.

A reverse proxy is mostly used for:
1. Server Anonymity
2. Caching
3. Load Balancing
4. DDoS Protection
5. Canary Experimentation
6. URL/Content Rewriting
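One of the reverse-proxy duties above, load balancing, can be sketched in a few lines of Python. This is a toy round-robin selector, not a real proxy, and the backend addresses are invented:

```python
import itertools

class RoundRobinBalancer:
    """Toy load balancer: hands out backends in rotation, the way a
    reverse proxy spreads incoming requests across its web servers."""

    def __init__(self, backends):
        self._cycle = itertools.cycle(backends)

    def pick(self):
        return next(self._cycle)

backends = ["10.0.0.1:8080", "10.0.0.2:8080", "10.0.0.3:8080"]
balancer = RoundRobinBalancer(backends)

# Six incoming requests cycle through the three backends twice
chosen = [balancer.pick() for _ in range(6)]
print(chosen[:3])
```

Real reverse proxies (nginx, HAProxy) layer health checks, weights, and sticky sessions on top of this basic idea.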




Monday, February 13, 2023

SOAP API

SOAP (Simple Object Access Protocol) is a messaging protocol that allows programs running on disparate operating systems or services (such as a frontend and a backend) to communicate over HTTP using Extensible Markup Language (XML).



SOAP uses WSDL (Web Services Description Language), an XML format for describing network services as a set of endpoints operating on messages containing either document-oriented or procedure-oriented information.
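For a feel of the wire format, here is a minimal SOAP 1.1 envelope built with Python's standard xml.etree module. The GetPrice operation and the example.com namespace are made up for illustration:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
APP_NS = "http://example.com/prices"   # hypothetical service namespace

ET.register_namespace("soap", SOAP_NS)

envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")

# Hypothetical operation: ask the service for the price of an item
request = ET.SubElement(body, f"{{{APP_NS}}}GetPrice")
item = ET.SubElement(request, f"{{{APP_NS}}}Item")
item.text = "coffee"

xml_bytes = ET.tostring(envelope, encoding="utf-8")
print(xml_bytes.decode())
```

A real SOAP client would POST these bytes over HTTP with a SOAPAction header, to an endpoint described by the service's WSDL.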

What to test in API testing


Validate keys against the min and max ranges the API allows.

Have a test case for XML and JSON schema validation.

Key verification: for JSON or XML APIs, verify that all expected keys are present in the response.

Verify how the API's error codes are handled.
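The checks above can be expressed as plain assertions against a parsed response. This sketch assumes a hypothetical response shape with id, price, and status fields:

```python
import json

# A sample API response body (hypothetical schema)
raw = '{"id": 42, "price": 9.99, "status": "ok"}'
response = json.loads(raw)

# 1. Key verification: all expected keys are present
expected_keys = {"id", "price", "status"}
missing = expected_keys - response.keys()
assert not missing, f"missing keys: {missing}"

# 2. Type / schema validation
assert isinstance(response["id"], int)
assert isinstance(response["price"], (int, float))

# 3. Min/max range validation
assert 1 <= response["id"] <= 10_000
assert 0 < response["price"] < 1_000

print("all checks passed")
```

In a real suite these would live in pytest test cases, with a dedicated schema validator for anything beyond a handful of keys.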




What is an API? In easy language

API stands for Application Programming Interface. APIs are basically a collection of functions and procedures that allow two applications or libraries to communicate.


For example : 

It is like a connector: all data flows into our organization through APIs.






In one line: an API is an interface between different software programs or services.


Restaurant API example:


The API is the messenger (the waiter) that takes your order, tells the system (the kitchen) what to do (prepare the food), and in return brings back the response you asked for (the waiter returns with the ordered food).

Types of APIs 


We are only concerned with Web APIs here:

Simple Object Access Protocol ( SOAP )
Remote Procedure Call (RPC)
Representational State Transfer (REST)


What is API testing ?

API testing is testing APIs and their integration with services.

In this guide, we mainly discuss REST API testing, where we test REST APIs for validation, error codes, and load.

What is REST API ?

As REST is an acronym for REPRESENTATIONAL STATE TRANSFER, statelessness is key. An API can be called REST if it follows the constraints below.

The REST architecture style describes six constraints:

1. Uniform interface

2. Stateless

3. Cacheable

4. Client-server

5. Layered system

6. Code on demand

Uniform interface: the uniform interface constraint defines the interface between clients and servers.

In other terms: the first constraint of REST states that client and server have to communicate and agree on certain rules about resources: they should exchange agreed-upon representations (such as JSON, XML, HTML, or plain text) with a proper encoding such as UTF-8.

Another point: they should communicate with self-descriptive messages, e.g. use standard MIME types.

Stateless

APIs in REST are stateless: the server keeps no client state between requests, so each request must carry all the information needed to process it.


Cacheable

As everywhere on the web, clients can cache responses. Responses should therefore, implicitly or explicitly, define themselves as cacheable or not; it is up to the server to decide when a cached response expires.

Client-server

Client and server are two different entities; servers and clients may be replaced and developed independently, as long as the interface between them is not altered.

Layered system

It means that any number of layered systems (proxies, gateways, load balancers) can sit between client and server; the client cannot tell, and it does not matter.

Code on Demand 

The server can keep code or logic on its side and transfer executable code (e.g. JavaScript) to the client when needed, rather than requiring all logic to live client-side.
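Statelessness in practice: each request must carry everything the server needs, so a handler can be a pure function of the request. The token and field names below are invented for illustration:

```python
def handle_request(request: dict) -> dict:
    """A stateless handler: the reply depends only on this request,
    never on anything remembered from an earlier request."""
    user = request.get("auth_token", "anonymous")  # identity travels with the request
    page = request.get("page", 1)                  # so does pagination state
    return {"status": 200, "user": user, "page": page}

# Two identical requests always get identical responses;
# there is no hidden server-side session to drift between them.
r1 = handle_request({"auth_token": "alice", "page": 3})
r2 = handle_request({"auth_token": "alice", "page": 3})
print(r1 == r2)  # True
```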


Sunday, February 12, 2023

Write REST APIs in Python - Router

 In FastAPI, a router is a mechanism for grouping a set of related endpoints and applying common functionality to all of them, such as prefixing their URLs or applying middleware to all requests to the endpoints.


Routers are created using the APIRouter class in FastAPI, which provides a convenient way to define multiple routes in a single place, and then include all of those routes in your main FastAPI application using the include_router method.

For example, if you have a set of endpoints for managing items in an e-commerce website, you could define all of those endpoints in a single file using an APIRouter and then include the router in your main FastAPI application. This would allow you to apply common functionality to all of those endpoints such as adding an authentication middleware to ensure that only authorized users can access the endpoints.

The main advantage of using routers in FastAPI is that they allow you to organize your application into smaller, more manageable parts making it easier to maintain and extend your code.

# File endpoints/items.py

from fastapi import APIRouter

router = APIRouter()

@router.get("/{item_id}")
async def read_item(item_id: int, q: str = None):
    return {"item_id": item_id, "q": q}

@router.put("/{item_id}")
async def update_item(item_id: int, q: str = None):
    # return the parameters we actually received
    return {"item_id": item_id, "q": q}


# File main.py

from fastapi import FastAPI
from .endpoints import items

app = FastAPI()
app.include_router(items.router, prefix="/items")