Aishwarya Sasane


◘ DATA ANALYTICS & DATA VISUALIZATION

Data Analytics & Data Visualization are at the heart of transforming raw data into meaningful business insights. 


My experience with data analytics & data visualization:


  • I have developed 20+ executive-level reports and dashboards—ranging from marketing performance and revenue trends to campaign conversion and customer segmentation—for clients in sectors such as financial services, retail, and auto-services.  Using tools like Power BI, Tableau, and Excel, I’ve enabled data-driven decisions by simplifying complex data into clear, visual narratives. 
  • At Crocs, I built a Power BI dashboard to continuously monitor data quality, enabling teams to proactively resolve 100+ data issues and maintain trusted, high-quality product and financial data.  
  • My analytics work also includes performing A/B testing on email campaigns, which contributed to a 5–7% increase in open rates and drove $1M in revenue uplift. 
  • I’ve analyzed consumer purchase behavior to uncover cross-sell & up-sell opportunities, leading to the reactivation of 5% of dormant consumers.
  • In addition to my technical expertise, I bring strong domain knowledge across Finance, Retail, Pharma, Ed-Tech, and Auto-Services, having consulted for companies in these sectors, which has enhanced the depth and relevance of my analytics work. 
  • For the Finance clients, I’ve worked with mutual funds and insurance datasets—conducting distributor performance analysis, identifying next-best offer opportunities by analyzing customer investment ratios, and performing competitor benchmarking to support strategic decision-making. 
  • For the Retail clients, I led market basket analysis initiatives that significantly increased average basket value by identifying high-performing product combinations. 
  • For the Pharmaceutical clients, I supported data-integration initiatives for regulatory compliance and enhanced analytics. 
  • For the Ed-Tech client, I utilized web analytics tools, including Google Analytics, to analyze website performance and gain insights into customer engagement trends and course popularity. By merging this data with subscriber information from databases, I was able to provide a comprehensive view of user behavior, enabling more informed decision-making and helping to optimize the platform’s content and user experience. 
  • For the Auto-Services client, I designed and implemented an optimized OLAP data warehouse model, specifically tailored to support advanced analytics and streamline reporting processes. This initiative significantly reduced reporting time and enhanced the overall efficiency of data analysis, enabling faster and more accurate decision-making. 
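The A/B testing mentioned above ultimately comes down to comparing two open-rate proportions. Below is a minimal sketch of that comparison as a two-proportion z-test in plain Python; the send and open counts are invented for illustration and are not the actual campaign figures.

```python
import math

def two_proportion_z(open_a, sent_a, open_b, sent_b):
    """Two-proportion z-test: is variant B's open rate significantly higher than A's?"""
    p_a, p_b = open_a / sent_a, open_b / sent_b
    # Pooled proportion under the null hypothesis (no difference)
    p_pool = (open_a + open_b) / (sent_a + sent_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    z = (p_b - p_a) / se
    return p_a, p_b, z

# Hypothetical campaign: 10,000 sends per variant
p_a, p_b, z = two_proportion_z(open_a=2000, sent_a=10000, open_b=2600, sent_b=10000)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  lift: {p_b - p_a:.1%}  z = {z:.2f}")
# With these numbers z far exceeds 1.96, so the 6-point lift is
# significant at the 5% level (two-sided).
```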


Presenting insights to both technical and non-technical audiences has sharpened my ability to turn data into action. 

◘ DATA GOVERNANCE & DATA QUALITY

Data Governance encompasses a range of activities aimed at enhancing data understandability, establishing controls, and preventing the insertion & usage of incorrect data. It also ensures that an organization adheres to data protection laws & regulations such as GDPR & HIPAA. 


My experience with data governance:


1. Documentation: My experience includes preparing comprehensive documentation such as: 

  • Business Metadata: Captures standardized definitions, business logic, and ownership of data points. This helped drive data quality rules and established accountability for data assets.
  • Technical Metadata: Includes technical information such as data types, system backend names, data lineage, and data transformation processes in downstream systems. This supported data architecture & system controls.
  • Compliance Documentation: Certifications such as ISO 27001:2022 & ISO PIMS require organizations to maintain specific documentation for evaluating the safety of data & information assets. This preserved important context & ensured a common understanding of terms & acronyms across the organization.


2. Data Quality:  I have evaluated & enhanced data quality by leveraging key metrics and applying domain-specific knowledge to ensure accuracy and relevance. Additionally, I have developed data quality monitoring dashboards to enable proactive resolution of errors on a continuous basis.

  • Metrics: I have measured data quality by accuracy, completeness, timeliness, consistency, and uniqueness. Each industry & organization has its own data quality rules to assess these metrics.
  • Domain expertise: I have developed significant business domain knowledge & collaborated with subject matter experts to develop these rules effectively.
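As a toy illustration of the metrics above, here is a minimal Python sketch that scores a small record set on completeness, uniqueness, and consistency. The record fields and the allowed-value rule are invented for the example, not taken from any client dataset.

```python
# Toy product records with typical quality defects
records = [
    {"sku": "A100", "price": 49.99, "currency": "USD"},
    {"sku": "A101", "price": None,  "currency": "USD"},   # incomplete price
    {"sku": "A100", "price": 49.99, "currency": "USD"},   # duplicate sku
    {"sku": "A102", "price": 12.50, "currency": "usd"},   # inconsistent casing
]

def completeness(rows, field):
    """Share of rows where the field is populated."""
    return sum(r[field] is not None for r in rows) / len(rows)

def uniqueness(rows, key):
    """Share of rows carrying a distinct key value."""
    return len({r[key] for r in rows}) / len(rows)

def consistency(rows, field, allowed):
    """Share of rows whose value conforms to the domain rule."""
    return sum(r[field] in allowed for r in rows) / len(rows)

print(f"price completeness: {completeness(records, 'price'):.0%}")
print(f"sku uniqueness:     {uniqueness(records, 'sku'):.0%}")
print(f"currency validity:  {consistency(records, 'currency', {'USD'}):.0%}")
```

In practice these scores would feed a monitoring dashboard, with the domain-specific rules (here, `currency` must equal `USD`) supplied by subject matter experts.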


3. System Controls & User Access Management: I have worked closely with IT security teams to implement robust security measures and identified & mitigated data-related risks.

  • Controls: I have implemented system controls to prevent unauthorized data usage & ensure data integrity.
  • User Access Management: I have ensured proper user access to data, typically following the principles of data minimization & least privilege.


Overall, I have applied data governance principles to ensure the availability of high-quality, trusted data—empowering informed decision-making and driving business success across the organization. 

◘ DATA MODELING & DATA MANAGEMENT

Data Modeling is the process of creating a visual representation of a system's data elements & the relationships between them, which helps organize & structure data so that it is stored, managed & used efficiently.


My experience with data modeling & data management:


I have hands-on experience building conceptual, logical & physical data models for databases (OLTP systems) such as MySQL, MS SQL Server & PostgreSQL, and for data warehouses (OLAP systems) such as Teradata & Snowflake, for companies across the finance, ed-tech & automotive domains.


  • Database data modeling: Databases are designed to support day-to-day operations & transactional processing. In this case, we emphasize data normalization to reduce redundancy & ensure data integrity. This kind of structure typically involves a large number of tables with complex relationships (e.g. one-to-many, many-to-many) and is optimized for fast read & write operations to handle frequent small transactions.
  • Data Warehouse data modeling: Data warehouses are designed to support analytical processing & decision making. In this case, we emphasize data denormalization to improve query performance & simplify data retrieval. This involves consolidating data into fewer tables, often using star or snowflake schemas. This kind of structure typically involves fact tables (containing measurable data) and dimension tables (containing descriptive attributes related to the facts), and is optimized for complex queries and large-scale data analysis, often involving aggregations & joins across large datasets.
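The star-schema idea above can be sketched concretely. The following uses Python's built-in sqlite3 module to stand up a hypothetical retail star schema (table and column names are illustrative, not any client's actual model): one fact table of sales keyed to product and date dimensions, queried with a typical OLAP-style aggregation.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Dimension tables: descriptive attributes
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,
    sku TEXT, category TEXT
);
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,
    full_date TEXT, quarter TEXT
);
-- Fact table: measurable events, keyed to the dimensions
CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product(product_key),
    date_key INTEGER REFERENCES dim_date(date_key),
    units_sold INTEGER, revenue REAL
);
""")
conn.execute("INSERT INTO dim_product VALUES (1, 'A100', 'Clogs')")
conn.execute("INSERT INTO dim_date VALUES (20250101, '2025-01-01', 'Q1')")
conn.execute("INSERT INTO fact_sales VALUES (1, 20250101, 3, 149.97)")

# Typical analytical query: aggregate the fact table, sliced by a dimension
row = conn.execute("""
    SELECT p.category, SUM(f.revenue)
    FROM fact_sales f JOIN dim_product p USING (product_key)
    GROUP BY p.category
""").fetchone()
print(row)  # ('Clogs', 149.97)
```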

◘ DATA PIPELINE DEVELOPMENT

A data pipeline encompasses the entire flow of data from source to destination, including ETL, data validation, enrichment & real-time processing. It typically involves four components: ingestion (collection of data from various sources), processing (transformation & enrichment), storage, and delivery (distribution of data to various endpoints such as dashboards, applications, or other systems).


My experience with data pipeline development:


  • I have built & automated ETL processes using tools like SSRS & Talend to extract data from transactional databases & flat files, transform it by formatting & standardizing fields to improve data quality, and load it into databases such as MS SQL Server.
  • I have also built end-to-end automated data pipelines that merged data from web analytics tools such as Google Analytics with transactional databases and loaded it into cloud data warehouse structures, utilizing AWS tools such as AWS AppFlow (for data ingestion), AWS S3 (for intermediate data storage), AWS Lambda (for transformations & real-time processing), and AWS Redshift (for final storage).
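The extract-transform-load stages described above can be sketched end to end in a few lines of plain Python. The file contents, field names, and target schema here are invented for the example; a real pipeline would swap in the actual sources and warehouse.

```python
import csv, io, sqlite3

# Extract: read a (simulated) flat file of raw orders
raw = io.StringIO(
    "customer,amount,date\n"
    " alice ,100.5,2025-01-02\n"
    "BOB,,2025-01-03\n"           # missing amount: fails quality check
)
rows = list(csv.DictReader(raw))

# Transform: trim & normalize names, cast types, drop rows failing validation
clean = [
    (r["customer"].strip().title(), float(r["amount"]), r["date"])
    for r in rows
    if r["amount"]                 # data quality rule: amount must be present
]

# Load: write the cleaned rows into a target database
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL, order_date TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", clean)

print(conn.execute("SELECT * FROM orders").fetchall())
# [('Alice', 100.5, '2025-01-02')]
```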


These experiences have equipped me with a strong understanding of scalable, reliable, and efficient data pipeline design—ensuring seamless data flow, improved data quality, and faster access to insights for business decision-making. 

◘ API PROGRAMMING

Web APIs are used for a variety of use cases: integrating different software systems to share data, accessing real-time data from external sources such as weather services, financial markets or social media platforms, automating repetitive tasks by integrating with workflow automation tools, and enabling developers to build applications, such as mobile apps or web services, that leverage a system's data & functionality.


My experience with API programming:


I have hands-on experience working with RESTful APIs in Python to extract and integrate data across platforms. For instance, I utilized APIs to pull data securely from Intralinks, a document management system, to enable seamless and secure integration with other enterprise systems. This not only streamlined data workflows but also ensured compliance with data access and governance standards. Through such projects, I’ve gained practical expertise in handling API authentication, parsing JSON/XML responses, and embedding API-based automation into broader data pipelines. 
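The pattern described above (authenticated request, JSON parsing, integration into a pipeline) looks roughly like the sketch below. The endpoint URL, bearer-token scheme, and response fields are hypothetical placeholders, not the actual Intralinks API; only Python's standard library is used.

```python
import json
import urllib.request

def fetch_page(url, token):
    """GET a JSON page from a REST endpoint using bearer-token auth."""
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def extract_documents(payload):
    """Flatten a paged JSON response into (id, name) tuples for downstream loading."""
    return [(d["id"], d["name"]) for d in payload.get("documents", [])]

# The parsing step can be exercised without a live endpoint:
sample = json.loads('{"documents": [{"id": 1, "name": "NDA.pdf"}], "next": null}')
print(extract_documents(sample))  # [(1, 'NDA.pdf')]
```

A production version would also handle pagination (following the `next` field), token refresh, and error retries.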

◘ AUTOMATION & PROCESS IMPROVEMENT

Automation & process improvement involves streamlining workflows, reducing manual effort & enhancing efficiency through technology-driven solutions.


My experience with automation & process improvements:


  • I’ve optimized SQL queries, applied column indexing, and automated ETL pipelines using tools like Talend and Airflow, leading to a 75% decrease in report generation time. 
  • I also automated the migration of 200+ datasets from legacy systems to modern storage solutions using Python, Bash, and Shell scripting, reducing migration time by 50% while ensuring compliance with HIPAA and SOX. 
  • I established a formal workflow for managing data changes, using a ticketing system to gather requirements documentation and tools like Alation & Collibra for downstream impact analysis via data lineage. 
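The column-indexing point above is easy to demonstrate: with an index on the filtered column, the database switches from a full table scan to an index search. A small sketch using Python's built-in sqlite3 (table and data are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE reports (id INTEGER, region TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO reports VALUES (?, ?, ?)",
    [(i, f"region_{i % 50}", i * 1.0) for i in range(10_000)],
)

query = "SELECT SUM(revenue) FROM reports WHERE region = 'region_7'"

# Query plan before indexing: a full table scan
before = conn.execute(f"EXPLAIN QUERY PLAN {query}").fetchall()

conn.execute("CREATE INDEX idx_reports_region ON reports(region)")

# Query plan after indexing: a search using idx_reports_region
after = conn.execute(f"EXPLAIN QUERY PLAN {query}").fetchall()

print(before[0][-1])  # full scan of "reports"
print(after[0][-1])   # search via the new index
```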

◘ PROJECT MANAGEMENT

Project Management has been a key part of my professional growth, enabling me to deliver high-impact data solutions while coordinating across teams and timelines.

My experience with project management:

  • At Hansa Cequity, I led multiple teams across simultaneous client projects, honing my skills in Agile methodology and using JIRA to manage deliverables, track progress, and communicate effectively with stakeholders. I also collaborated with the sales team to prepare project proposals, offering realistic estimates of resources, time, and effort for new and innovative engagements. 
  • At Numen IT, I managed the parallel migration of over 200 datasets while coordinating with global stakeholders, using Asana to track tasks and share real-time updates. 
  • At Crocs, I used JIRA for task management and further developed my expertise in Smartsheet to create and maintain quarterly project plans, ensuring smooth execution and transparency across initiatives. 

◘ TEACHING & MENTORING

Teaching & mentoring is a fulfilling extension of my professional journey, allowing me to share knowledge, empower others, and strengthen my own understanding through collaboration. 


My experience with teaching & mentoring: 


  • At Hansa Cequity, I led a hands-on Data Warehousing course for junior analysts and interns, focusing on data modeling and SQL for Teradata. 
  • My passion for teaching continued at the University of Arizona, where I served as a teaching assistant for both undergraduate and graduate-level courses. I conducted lab sessions for the undergraduate course ‘Database Management Systems,’ teaching SQL fundamentals, and supported two graduate courses: ‘Enterprise Data Management,’ where I taught ER modeling using Visio and advanced SQL in Oracle, and ‘Business Intelligence,’ where I helped students build impactful visualizations in Power BI and develop data warehousing solutions.


These experiences have reinforced my belief in the power of mentorship and continuous learning. 

◘ CERTIFICATIONS ◘

  • SAP :  I pursued certifications in SAP ERP, S/4HANA, and Materials Management to strengthen my understanding of master data management within SAP systems. These courses not only enhanced my technical knowledge of SAP’s data structures but also provided valuable insights into operations and supply chain processes, equipping me with a deeper appreciation of how enterprise workflows are executed and optimized in real-world SAP environments.   This foundation significantly helped me better understand and manage product and finance master data at Crocs, enabling more effective data quality and governance initiatives.


  • Alation: I earned certifications in Alation, a leading data catalog platform, to strengthen my ability to support and drive data governance initiatives at Crocs. This training enhanced my skills in data stewardship, metadata management, and data lineage analysis, enabling more informed and compliant data practices across the organization. 


  • AWS:  I obtained AWS certifications to gain a comprehensive understanding of the cloud ecosystem and its suite of services, enabling me to design and build robust, scalable, and fully automated data pipelines on AWS. 


Link to view certificates: Certifications




Copyright © 2025 Aishwarya Sasane - All Rights Reserved.
