Experiences Along an Analytics Journey: A Look at 3 Case Studies

Virtual Breakout Session

Tuesday, June 9, 2020
2:00pm EDT

Register now!

This is a consume-at-your-own-pace event that will take place 100% online. All of the content has been pre-recorded and will be released at its scheduled time. If you can’t listen to or watch a session when it’s released, remember that you can access it at ANY TIME in the next 30 days.

  • Listen to podcasts
  • View webinars
  • Watch videos
  • Access articles, white papers, case studies and research reports
  • Chat with speakers and attendees via discussion forums
  • Visit sponsor profile pages

With analytics, organizations can find and exploit patterns in their data to identify risks, opportunities, and actionable insights. Models can be designed to discover relationships between behavioral factors. Join us as we share our analytics journey by exploring three case studies.

Feel free to ask questions in the discussion forum below; speakers will respond as quickly as they can. View this webinar by the end of the day on June 9 and be entered to win a $100 Amazon Gift Card.

Speakers:
Sponsored by:

Discussion

8 Comment threads
5 Thread replies
0 Followers
 
Joselle Barnett (Member)

I may have missed it, but what analytics software was used to derive these use cases?

Carol Bogacz (Member, Speaker)

Joselle, each case study used different software. The Oklahoma University case study used SkySpark, which analyzes data from automation systems, meters, sensors, and other smart devices to identify issues, patterns, deviations, faults, and opportunities for operational improvement and cost reduction. The ODEC Plant Performance Predictor used a combination of Azure Machine Learning Studio and R for the analysis and Power BI to visualize the results. The third case study, Predicting Reactive Power Demand, used SAS to perform the analysis and SAS Visual Analytics to visualize the results.
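As a rough illustration of what a modeling-plus-visualization flow like the ODEC one might look like in code, here is a minimal Python sketch. The data, column names, and model choice are assumptions made purely for illustration; the actual project used Azure Machine Learning Studio, R, and Power BI.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

# Synthetic stand-in for historical plant data (hypothetical columns, not the presenters' data).
rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "ambient_temp": rng.normal(20, 8, n),
    "load_mw": rng.uniform(50, 400, n),
})
df["heat_rate"] = 9500 - 0.4 * df["load_mw"] + 12 * df["ambient_temp"] + rng.normal(0, 50, n)

# Train on the older portion of the history, predict on the newer portion.
features = ["ambient_temp", "load_mw"]
train, test = df.iloc[:800], df.iloc[800:]
model = LinearRegression().fit(train[features], train["heat_rate"])
predictions = model.predict(test[features])

# In the case study, results like these were exported and visualized in a BI tool.
print(pd.DataFrame({"actual": test["heat_rate"].to_numpy(), "predicted": predictions}).head())
```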

Hunter Ramirez (Member)

Carol, great information here! What are some common strategies you use to gain insights from the data?

Carol Bogacz (Member, Speaker)

Thanks, Hunter! There are many different strategies, but some of the important things to remember are (a rough sketch of these steps in code follows below):
• Filter the data to cut the noise and focus on the most interesting topic
• Summarize and segment different groups if possible
• Bring the data to life through visualizations
• Focus on trends, not data points
• Search for strong relationships
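As an illustration only, here is a minimal pandas sketch of those five steps; the dataset and column names are made up and are not from the case studies.

```python
import numpy as np
import pandas as pd

# Hypothetical daily meter readings for three sites.
rng = np.random.default_rng(1)
n = 3650
df = pd.DataFrame({
    "site": rng.choice(["A", "B", "C"], n),
    "date": pd.Timestamp("2015-01-01") + pd.to_timedelta(rng.integers(0, n, n), unit="D"),
    "kwh": rng.normal(500, 80, n),
})

# 1. Filter the data to cut the noise and focus on the most interesting topic.
recent = df[df["date"] >= "2023-01-01"]

# 2. Summarize and segment different groups.
by_site = recent.groupby("site")["kwh"].agg(["mean", "std"])

# 3. Bring the data to life through visualizations (requires matplotlib).
by_site["mean"].plot(kind="bar", title="Average kWh by site")

# 4. Focus on trends, not individual data points.
monthly_trend = recent.set_index("date")["kwh"].resample("MS").mean()

# 5. Search for strong relationships.
correlations = recent.assign(month=recent["date"].dt.month)[["kwh", "month"]].corr()

print(by_site, monthly_trend.head(), correlations, sep="\n\n")
```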

Tom Johnson (Member)

What is the frequency of data normalization tasks – is it ongoing, based on anomalies, or periodic, based on data-driven values?

Gaylen Kirkwood (Member)

Would really like a set of the slides to use/plagiarize/borrow. Thank you! gkirkwood@joemc.com

Carol Bogacz (Member, Speaker)

Thanks, Gaylen. I will reach out to you via email.

Gaylen Kirkwood (Member)

For the ODEC project, you said you had a LOT of historical data. How much would you have considered actually valuable? I am starting fresh, and am finding that 20+ years of invoices, member data, etc. may be too much. How do you decide how much is enough?

Carol Bogacz (Member, Speaker)

Gaylen, the amount of data often depends on the question being asked. For instance, we were predicting 7 days ahead, so we wanted to use enough data to capture any changes that may occur over time due to climate, temperature, and location.
In your case, what are you trying to gain from the data? Is there a particular question you are trying to answer?
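One practical way to explore "how much history is enough" is to backtest the same model with different lengths of history and see where accuracy stops improving. The sketch below uses synthetic data and a hypothetical 7-day-ahead target purely to illustrate the idea; it is not the approach used in the case studies.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error

# Synthetic daily demand with a yearly seasonal pattern (illustrative only).
days = pd.date_range("2005-01-01", "2019-12-31", freq="D")
rng = np.random.default_rng(2)
demand = 1000 + 150 * np.sin(2 * np.pi * days.dayofyear / 365) + rng.normal(0, 40, len(days))
df = pd.DataFrame({"date": days, "demand": demand})
df["target"] = df["demand"].shift(-7)        # the value 7 days ahead
df["dayofyear"] = df["date"].dt.dayofyear
df = df.dropna()

# Hold out the most recent year, then train with progressively longer histories.
test = df[df["date"] >= "2019-01-01"]
for years in [1, 3, 5, 10]:
    cutoff = pd.Timestamp("2019-01-01") - pd.DateOffset(years=years)
    train = df[(df["date"] >= cutoff) & (df["date"] < "2019-01-01")]
    model = LinearRegression().fit(train[["demand", "dayofyear"]], train["target"])
    err = mean_absolute_error(test["target"], model.predict(test[["demand", "dayofyear"]]))
    print(f"{years:>2} years of history -> MAE {err:.1f}")
```

Where the error stops improving as the window grows is a reasonable signal that additional history adds little.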

Rakesh Vaidyanathan (Member)

What analytics approaches are used in bidding for transmission line concessions?

Carol Bogacz (Member, Speaker)

Rakesh, can you elaborate a bit more on your question?

Farah Hassan (Member)

Excellent presentation, Carol! How did you use data analytics for the Finance department?

Farah Hassan (Member)

Do you impute missing data with an average number or do you use a predefined number?
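For reference, here is a minimal pandas illustration of the two options the question mentions (imputing with the average versus a predefined constant), using made-up values; it does not reflect which approach the speakers actually used.

```python
import numpy as np
import pandas as pd

# Made-up readings with gaps, just to show the two imputation options side by side.
readings = pd.Series([410.0, np.nan, 395.5, 402.0, np.nan, 388.0])

mean_imputed = readings.fillna(readings.mean())   # impute with the average
constant_imputed = readings.fillna(0.0)           # impute with a predefined number

print(pd.DataFrame({
    "original": readings,
    "mean_imputed": mean_imputed,
    "constant_imputed": constant_imputed,
}))
```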
