Learning Python II

Recap

The fundamentals of Python were addressed in the previous blog post at https://learnpythonbybrar.blogspot.com/2023/02/learning-python.html: the language's syntax, its core concepts, the structure of a Python program, and many other topics. But what if I told you that those skills alone are still insufficient for creating sophisticated programs? What else can I learn now, you must be thinking. Be at ease; I've got you. This blog post will walk you through an example project, which demonstrates how to apply the ideas we've learned and how to put everything together.

Let's begin the project, then.


Project

Today we will build a project on the topic: WEB SCRAPING WITH PYTHON

Introduction

Web scraping is the process of automatically extracting data from websites. In this assignment, we will learn how to scrape data from a website using Python. We will use the requests and BeautifulSoup libraries to fetch and parse HTML data.

Assignment


In this assignment, we will be scraping data from a website that contains information about movies. The website we will be using is IMDB (https://www.imdb.com). We will extract the top-rated movies from the website and store the data in a CSV file.

Steps:

1. Import the necessary libraries:

The first step is to import the necessary libraries. We will be using the requests and BeautifulSoup libraries to fetch and parse the HTML data.
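For example (note that requests and beautifulsoup4 are third-party packages and would need to be installed first, e.g. with `pip install requests beautifulsoup4`; only csv ships with Python):

```python
import csv                     # writing the output file (standard library)
import requests                # fetching the HTML over HTTP
from bs4 import BeautifulSoup  # parsing the fetched HTML
```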


2. Fetch the HTML data:

Next, we will use the requests library to fetch the HTML data from the IMDB website. We will pass the URL of the page we want to scrape as an argument to the requests.get() function.
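A sketch of the fetch step, pointed at IMDb's Top 250 chart. The User-Agent header here is an assumption on my part: sites like IMDb often reject requests that arrive without a browser-like one.

```python
import requests

URL = "https://www.imdb.com/chart/top/"
HEADERS = {"User-Agent": "Mozilla/5.0"}

def fetch_html(url):
    """Fetch a page and return its HTML, raising on HTTP error codes."""
    response = requests.get(url, headers=HEADERS, timeout=10)
    response.raise_for_status()  # turn 4xx/5xx responses into exceptions
    return response.text
```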


3. Parse the HTML data:

Once we have fetched the HTML data, we will use the BeautifulSoup library to parse it. We will create a BeautifulSoup object and pass the HTML data and the parser we want to use as arguments.
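For instance, using a small stand-in string in place of the fetched page:

```python
from bs4 import BeautifulSoup

# A tiny stand-in for the HTML fetched in the previous step
html = "<html><head><title>Top Movies</title></head><body></body></html>"

# "html.parser" is Python's built-in parser; "lxml" is a faster
# alternative if it is installed
soup = BeautifulSoup(html, "html.parser")
print(soup.title.string)  # -> Top Movies
```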


4. Extract the data:

Now that we have parsed the HTML data, we can extract the data we want. We will be extracting the title, year, and rating of each movie on the page.
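The sketch below runs against a made-up HTML fragment; IMDb's real class names differ and change over time, so the selectors here are purely illustrative, and you would need to inspect the live page to find the actual ones:

```python
from bs4 import BeautifulSoup

# Made-up markup mimicking a list of movies; not IMDb's real structure
html = """
<ul>
  <li class="movie">
    <span class="title">The Shawshank Redemption</span>
    <span class="year">1994</span>
    <span class="rating">9.3</span>
  </li>
  <li class="movie">
    <span class="title">The Godfather</span>
    <span class="year">1972</span>
    <span class="rating">9.2</span>
  </li>
</ul>
"""

soup = BeautifulSoup(html, "html.parser")
movies = []
for item in soup.select("li.movie"):  # one entry per movie row
    movies.append((
        item.select_one(".title").get_text(strip=True),
        item.select_one(".year").get_text(strip=True),
        item.select_one(".rating").get_text(strip=True),
    ))
print(movies)
```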


5. Store the data in a CSV file:

Finally, we will store the data we have extracted in a CSV file. We will create a CSV file and write the data to it using the csv.writer() function.
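A sketch of the CSV step, with a couple of sample rows standing in for the scraped data:

```python
import csv

# Sample rows standing in for the scraped (title, year, rating) data
movies = [
    ("The Shawshank Redemption", "1994", "9.3"),
    ("The Godfather", "1972", "9.2"),
]

# newline="" prevents blank lines between rows on Windows
with open("top-rated-movies.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["Title", "Year", "Rating"])  # header row
    writer.writerows(movies)                      # one row per movie
```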



Demonstration

Let's run the above code and see how it works.
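Putting the steps together, the script might look like the sketch below. The CSS selectors are placeholders, not IMDb's real class names, so they would need to be adjusted to the site's current markup before calling main():

```python
import csv
import requests
from bs4 import BeautifulSoup

URL = "https://www.imdb.com/chart/top/"
HEADERS = {"User-Agent": "Mozilla/5.0"}

def parse_movies(html):
    """Extract (title, year, rating) tuples from the chart HTML.

    The selectors are placeholders: inspect the live page and swap in
    the class names IMDb is actually using.
    """
    soup = BeautifulSoup(html, "html.parser")
    movies = []
    for item in soup.select("li.movie"):
        movies.append((
            item.select_one(".title").get_text(strip=True),
            item.select_one(".year").get_text(strip=True),
            item.select_one(".rating").get_text(strip=True),
        ))
    return movies

def save_csv(movies, path="top-rated-movies.csv"):
    """Write the scraped rows to a CSV file with a header row."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["Title", "Year", "Rating"])
        writer.writerows(movies)

def main():
    response = requests.get(URL, headers=HEADERS, timeout=10)
    response.raise_for_status()
    save_csv(parse_movies(response.text))

# call main() to run the scraper against the live site
```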



Once the code is executed, a new file named 'top-rated-movies.csv' will be created in the same directory as the Python script. This file will contain the data we scraped from the IMDB website in the form of a table.

Conclusion

In this assignment, we learned how to scrape data from a website using Python. We used the requests and BeautifulSoup libraries to fetch and parse HTML data, and then we extracted the data we wanted and stored it in a CSV file. Web scraping can be a powerful tool for collecting data from websites, and it can be used for a variety of purposes, such as data analysis, research, and automation.

Topic II

Next, we will write Python code for the QuickSort algorithm. I am pretty sure all of us have heard of it, or have even written it in Java.

Here are the instructions for it.

Write a Python program that implements the QuickSort algorithm to sort an array of integers. The program should prompt the user to enter the array elements, and then sort and output the array in ascending order.

Here are some basic requirements:
  1. The program should use the input() function to prompt the user for input.
  2. The program should use the split() method to split the user input into separate elements.
  3. The program should use the map() function to convert the input values to integers.
  4. The program should implement the QuickSort algorithm to sort the array.
  5. The program should output the sorted array using the print() function.

Implementation
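One way the code might look is sketched below. Call main() for the interactive version the requirements describe; the exact prompt and output wording are my own choices:

```python
def quicksort(arr):
    """Sort a list of integers with QuickSort and return the result."""
    if len(arr) <= 1:           # base case: 0 or 1 elements, already sorted
        return arr
    pivot = arr[len(arr) // 2]  # choose the middle element as the pivot
    left = [x for x in arr if x < pivot]     # smaller than the pivot
    middle = [x for x in arr if x == pivot]  # equal to the pivot
    right = [x for x in arr if x > pivot]    # greater than the pivot
    return quicksort(left) + middle + quicksort(right)

def main():
    # Prompt for space-separated values, split them, and map them to ints
    arr = list(map(int, input("Enter the array elements: ").split()))
    sorted_arr = quicksort(arr)
    print("Sorted array: {}".format(sorted_arr))

print(quicksort([3, 6, 1, 8, 2]))  # -> [1, 2, 3, 6, 8]
```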




In the above code, we first define the quicksort() function to implement the QuickSort algorithm. The function takes an array as input, and recursively partitions the array into smaller subarrays based on a pivot value. We use list comprehension to create subarrays for elements less than, equal to, and greater than the pivot value. We then recursively sort the left and right subarrays, and combine them with the middle subarray (containing the pivot value) to form the sorted array.


Next, we prompt the user to enter the array elements as a string, using the input() function. We then split the string into separate elements using the split() method, and convert each element to an integer using the map() function.

We then call the quicksort() function with the input array to sort it using the QuickSort algorithm. We store the sorted array in the sorted_arr variable.

Finally, we output the sorted array using the print() function. We use string formatting to include the array in the output.
