Data Science Scaling | Guidance on AI/ML - Practitioner Viewpoints for Young Aspirants | Making a Beginning | Becoming a Lifelong Learner (Part 3)
Photo by Pranam Gurung (https://unsplash.com/@pranamgurung)

Mentoring early aspirants in data science is both a nourishing and a daunting task. Today, with a plethora of resources available (courses, code, etc.), many learners, in my opinion, skip essential steps. Following a disciplined approach is critical. This article is intentionally written in a simple style, with subheadings for the relevant areas.

With technical advances fueling Artificial Intelligence (AI), mentorship and guidance play a vital role in ensuring an effective journey for a young aspirant. The following post attempts to break this down into perspectives and steps. This post will continue to evolve based on the availability of relevant content.

AI/ML/data science is hard: it requires a consistent and planned approach to arrive at relevant solutions, along with lots of hard work, humility and collaboration!

Identify your focus areas and specialization

AI is a vast technical domain, with many cross-domain areas that are critical. Some suggestions for embarking on this:

  1. Understand the foundations at a base level through foundation courses.
  2. If keen on pursuing further, take more hands-on courses where you build some models and gain first-hand experience.
     2.1 Please note this must be end to end; expand beyond a given problem/dataset from sources (Kaggle, UCI Irvine, etc.).
     2.2 Think about whether your aspiration is a hobby or a profession; based on that need, tailor your approach.
     2.3 In particular, start exploring whether work has already been done on the problem. Follow experts and the research being done.
  3. Subscribe to and follow leading experts and practitioners on multiple channels: social media (LinkedIn, etc.), blogs (Medium, Substack, etc.) and podcasts (Spotify, etc.).
     3.1 Have a structured flow to capture your learnings.
     3.2 Attend workshops and webinars, and join specialized groups.
     3.3 Foundational understanding (mathematics, statistics, computer science, etc.) needs to be pursued at the level your requirement demands.
  4. You will need to plan those journeys carefully and in a structured manner.
     4.1 Again, you can decide the intent based on the requirement of the pathway.
     4.2 Whichever pathway you choose, domain knowledge is key; being able to get into granular data depth will play a vital role in modelling.

Basics of Data Science Scaling

The following series of articles provides a good foundation on what it means to build for scale.

  1. Data Science Scaling | Bringing it alive: https://www.dhirubhai.net/pulse/data-science-bringing-alive-scaling-dr-shyam-sundaram
  2. Data Science | Bringing Alive Scaling | Soft Skills: https://www.dhirubhai.net/pulse/data-science-bringing-alive-scaling-soft-skills-part-2-sundaram

Stand-alone AI/ML will not scale

Technology and software will play a vital role. Identifying your role in all this is key and will help you shape the journey.

Understanding how to build end to end, and how the result can be consumed in responsible ways, is key. This is a cross-domain realm spanning data science, software engineering, enterprise architecture and business architecture.

Domain knowledge / Problem-solving Perspectives

Data is the key to any relevant data science. This has many layers, such as the following.

  1. Sources of data
  2. Domain-level functional and granular data understanding
  3. End-to-end domain/process life cycle
  4. Comprehensive data depth through hands-on data exploration and understanding
  5. Comprehensive exploratory data analysis (EDA)
  6. Building explainable data storytelling, using a tool (Excel) or BI tools, is key
  7. Technical discussions and whiteboarding are critical
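As a minimal sketch of the hands-on exploration and EDA in points 4 and 5 above, the first questions are always shape, schema, missing values and a granular view by a business dimension. The tiny inline dataset here is hypothetical; substitute any Kaggle or UCI CSV you are studying.

```python
import pandas as pd

# Hypothetical toy dataset standing in for a real domain table.
df = pd.DataFrame({
    "region": ["north", "south", "north", "east"],
    "sales": [120.0, 85.5, None, 240.0],
})

# Shape, schema and missing values: the first questions of any EDA.
print(df.shape)        # (rows, columns)
print(df.dtypes)       # data type per column
print(df.isna().sum()) # missing values per column

# Summary statistics for the numeric column.
print(df["sales"].describe())

# Granular, domain-level view: aggregate by a business dimension.
print(df.groupby("region")["sales"].mean())
```

From here, the same few lines generalize to any dataset, and the aggregates become the raw material for the data storytelling in point 6.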

What else to consider: Enterprise Architecture (EA) / Data Architecture, etc.?

The EA community has built a number of high-quality frameworks (technical frameworks, design patterns such as GoF, Python design patterns, etc.). A practitioner-led approach will be key, with the following anchors. These frameworks and patterns are critical for end-to-end design.

1. Data architecture: granular, comprehensive, robust and scalable.

1.1 Factor in reuse and data stewardship.

1.2 Data models, both custom and industry standard.

1.3 Data diversity in form: structured and unstructured.

2. An AI-integrated architecture that is modular and plugin oriented, for scalability and for enabling pipeline/end-to-end consumption.

3. Open-source/technology frameworks for technology middleware.

4. Open-source/technology for data science/ML/AI, NLP and data engineering.

5. Near/real-time, streaming and batch integration architecture.
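One way to picture the modular, plugin-oriented architecture of point 2 is a small step registry, where each stage can be registered and swapped without touching the pipeline driver. This is an illustrative sketch, not any specific framework; all names here (`register`, `run_pipeline`, the `clean` and `score` steps) are hypothetical.

```python
from typing import Callable, Dict, List

# Registry of named pipeline steps; plugins add themselves here.
STEPS: Dict[str, Callable[[dict], dict]] = {}

def register(name: str):
    """Decorator that plugs a step into the registry under a name."""
    def wrap(fn: Callable[[dict], dict]) -> Callable[[dict], dict]:
        STEPS[name] = fn
        return fn
    return wrap

@register("clean")
def clean(record: dict) -> dict:
    # Trim whitespace from string fields.
    return {k: v.strip() if isinstance(v, str) else v for k, v in record.items()}

@register("score")
def score(record: dict) -> dict:
    # Placeholder model step: a real system would call a trained model here.
    record["score"] = len(record.get("text", ""))
    return record

def run_pipeline(record: dict, order: List[str]) -> dict:
    """Drive the configured steps end to end, in the given order."""
    for name in order:
        record = STEPS[name](record)
    return record

result = run_pipeline({"text": "  hello  "}, order=["clean", "score"])
print(result)  # {'text': 'hello', 'score': 5}
```

The design choice is that the driver knows only step names, so a batch job and a streaming consumer can reuse the same registered steps with different orderings or configurations.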


With this orientation, please proceed to the link, which takes you to the relevant reading.




