AI is looking for talented freshers who love to immerse themselves in data to find new insights and meaning within it. This internship may be converted into a full-time position if all the boxes are ticked.
Selected intern's day-to-day responsibilities include:
1. Analyze large amounts of information to discover trends and patterns.
2. Web-scrape and build data pipelines using Airflow and PostgreSQL.
3. Build dashboards on Apache Superset and configure the pipeline.
4. Host the pipeline on AWS instances.
Other requirements:
1. Must have knowledge of SQL, Python, and PySpark.
2. Must be familiar with at least one SQL and one NoSQL database.
3. Must be familiar with ETL and ELT pipelines.
4. Must be an expert in web scraping with Python using Selenium and Beautiful Soup.
5. Must understand the basics of data warehousing.
6. Familiarity with Apache Airflow, Apache NiFi, or Kafka is an advantage.
7. Must have experience with business intelligence tools and data frameworks, and be comfortable with at least one BI tool such as Power BI or Tableau.
8. Must have a BSc/B.Tech in computer science, engineering, or another relevant field; a good certification in data engineering is preferred.