Eliminate Column: Sathish, an MBA graduate, is currently assigned to an event-management project. He soon started accepting orders, and while working on them he found it hard to maintain the records, so he decided to purchase a system that would help him keep track of all events and maintain all current and upcoming ones.
Pleased with the system the development team built for his previous requirement, he asks the team to upgrade it so that it also cleans the dataset column-wise. Write a program for the system that accepts the number of events (n) in the dataset, followed by n event details one after the other. While updating the events, if any of the fields listed below is missing, delete that entire column from the dataset.
Note:
Column names in the dataset should be as follows:
('Sl No', 'EventName', 'EventManager', 'NoOfDates', 'StartDate', 'EndDate', 'HallName', 'HallAddress', 'HallPricePerDay', 'TotalCost')
Eliminate Column using Pandas
pandas.DataFrame.drop: https://pandas.pydata.org/docs/reference/api/pandas.DataFrame.drop.html
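For reference, a minimal sketch (not the graded solution) of how pandas.DataFrame.drop removes columns by label; the tiny frame and the choice of 'TotalCost' below are only illustrative:

import pandas as pd

# Toy frame standing in for part of the event dataset.
df = pd.DataFrame({'Sl No': [1, 2],
                   'EventName': ['Marraige', 'Birthday Celebration'],
                   'TotalCost': [120000, None]})
# drop(columns=...) removes the named columns; drop(['TotalCost'], axis=1) works the same way.
print(df.drop(columns=['TotalCost']))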
Input Format:
The first line of the input consists of the number of rows in the dataset (n).
The next n lines correspond to the data to be added to the dataset.
Output Format:
The output consists of the data in the dataset after dropping the columns that contained missing values.
Note: Use the tabulate module to print the output in the given format.
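A small sketch of the tabulate call that produces the expected layout (headers='keys' prints the column names, tablefmt='psql' draws the +---+ borders); the one-row frame is just for illustration:

import pandas as pd
from tabulate import tabulate

df = pd.DataFrame({'Sl No': [1], 'EventName': ['Marraige']})
# headers='keys' uses the DataFrame's column labels; tablefmt='psql' matches the sample output.
print(tabulate(df, headers='keys', tablefmt='psql'))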
Sample Input-Output:
Enter number of records
8
1 ,Marraige,krishnanand,2,12-08-2005,13-08-2005,krishna mansion,Krishna Mansion Krishnalaya Shringeri,60000,120000
2 ,Birthday Celebration,Akshay,1,14-08-2005,14-08-2005,Akshayalaya,Akshayalaya Akshaya Dama Yogendra Nagar Shimoga,35000,35000
3 ,House Warming,sunil,1,21-10-2005,21-10-2005,Sugamya Corner,Sugamya corner Vijaya nagar Mysore,50000,50000
4 ,Naming Ceremony,Surya,1,10-11-2005,10-11-2005,Suryodaya comforts,Suryodaya comforts Srirampura,20000,
5 ,Promossion Celebration,Aliya,1,15-11-2005,15-11-2005,Kabule convention hall,Kabule convention hall Bangalore,20000,20000
6 ,New Year Celebration,,2,31-12-2005,01-12-2005,Chinnaswammy stadium,Bangalore,,24000
7 ,Chrismas Celebration,Collin,2,24-12-2005,,Saint Philominas chruch,Saint Philominas chruch Mysore,25000,50000
7 ,Chrismas Celebration,Collin,2,24-12-2005,,Saint Philominas chruch,Saint Philominas chruch Mysore,25000,50000
+---+-------+------------------------+-----------+------------+-------------------------+-------------------------------------------------+
|   | Sl No | EventName              | NoOfDates | StartDate  | HallName                | HallAddress                                     |
+---+-------+------------------------+-----------+------------+-------------------------+-------------------------------------------------+
| 0 | 1     | Marraige               | 2         | 12-08-2005 | krishna mansion         | Krishna Mansion Krishnalaya Shringeri           |
| 1 | 2     | Birthday Celebration   | 1         | 14-08-2005 | Akshayalaya             | Akshayalaya Akshaya Dama Yogendra Nagar Shimoga |
| 2 | 3     | House Warming          | 1         | 21-10-2005 | Sugamya Corner          | Sugamya corner Vijaya nagar Mysore              |
| 3 | 4     | Naming Ceremony        | 1         | 10-11-2005 | Suryodaya comforts      | Suryodaya comforts Srirampura                   |
| 4 | 5     | Promossion Celebration | 1         | 15-11-2005 | Kabule convention hall  | Kabule convention hall Bangalore                |
| 5 | 6     | New Year Celebration   | 2         | 31-12-2005 | Chinnaswammy stadium    | Bangalore                                       |
| 6 | 7     | Chrismas Celebration   | 2         | 24-12-2005 | Saint Philominas chruch | Saint Philominas chruch Mysore                  |
| 7 | 7     | Chrismas Celebration   | 2         | 24-12-2005 | Saint Philominas chruch | Saint Philominas chruch Mysore                  |
+---+-------+------------------------+-----------+------------+-------------------------+-------------------------------------------------+
Sample Input-Output 2:
Enter number of records
0
Invalid Input
Additional Sample Test Cases
Sample Input and Output 1:
Enter number of records
0
Invalid Input
Solution:
import pandas as pd
import numpy as np
from tabulate import tabulate

n = int(input("Enter number of records\n"))
df = pd.DataFrame(columns=['Sl No', 'EventName', 'EventManager', 'NoOfDates', 'StartDate',
                           'EndDate', 'HallName', 'HallAddress', 'HallPricePerDay', 'TotalCost'])
if n <= 0:
    print("Invalid Input")
else:
    # Read n comma-separated records and append each one as a new row.
    for i in range(n):
        df.loc[i] = input().split(',')
    # Treat empty (or whitespace-only) fields as missing values.
    df = df.replace(r'^\s*$', np.nan, regex=True)
    # Drop every column that contains a missing value and print in psql style.
    print(tabulate(df.dropna(axis='columns'), headers='keys', tablefmt='psql'))
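The same column elimination can also be expressed with pandas.DataFrame.drop (the method linked in the note above) instead of dropna. A self-contained sketch, with a toy frame standing in for the full dataset:

import numpy as np
import pandas as pd
from tabulate import tabulate

# Toy frame in which 'TotalCost' contains a missing value.
df = pd.DataFrame({'Sl No': [1, 2],
                   'EventName': ['Marraige', 'House Warming'],
                   'TotalCost': [120000, np.nan]})
# Collect the labels of every column that has at least one NaN, then drop those columns.
cols_with_missing = df.columns[df.isna().any()]
print(tabulate(df.drop(columns=cols_with_missing), headers='keys', tablefmt='psql'))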
Happy Learning – If you require any further information, feel free to contact me.