Data Science Scaling | Guidance on AI/ML: Practitioner Viewpoints for Young Aspirants | Making a Beginning | Becoming a Lifelong Learner (Part 3)
Mentoring early aspirants to data science is both a nourishing and a daunting task. Today, with a plethora of resources available (courses, code, etc.), many learners skip foundational steps; following a disciplined approach is critical. The following article is intentionally written in a simple style, with subheadings for each relevant area.
With technical advances fueling Artificial Intelligence (AI), mentorship and guidance play a vital role in ensuring an effective journey for a young aspirant. The following post attempts to break this down into perspectives and steps. This post will continue to evolve as relevant content becomes available.
AI/ML/data science is hard. It requires a consistent and planned approach to arrive at relevant solutions, along with lots of hard work, humility, and collaboration!
Identify your focus areas and specialization
AI is a vast technical domain with many critical cross-domain areas. Some suggestions for embarking on this journey follow.
Basics of Data Science Scaling
The following series of articles provides a good foundation in what it means to build for scale.
Stand-alone AI/ML will not scale
Technology and software will play a vital role. Identifying your own role in all this is key and will help you shape the journey.
Understanding how to build end to end, and how solutions can be consumed in responsible ways, is key. This is a cross-domain realm spanning data science, software engineering, enterprise architecture, and business architecture.
Domain knowledge / Problem-solving Perspectives
Data is the key to any relevant data science work, and it has many layers.
What else to consider: Enterprise Architecture (EA), Data Architecture, etc.
The EA community has built a number of high-quality frameworks and design patterns (technical frameworks, GoF design patterns, Python design patterns, etc.). A practitioner-led approach will be key, built around the following anchors. These frameworks and patterns are critical for end-to-end design.
1. Data architecture: granular, comprehensive, robust, and scalable
1.1 Factor in reuse and data stewardship
1.2 Data models, both custom and industry standard
1.3 Data diversity: structured and unstructured forms
2. AI-integrated architecture that is modular and plugin-oriented, for scalability and end-to-end pipeline consumption
3. Open-source/technology frameworks for middleware
4. Open-source/technology frameworks for data science, ML/AI, NLP, and data engineering
5. Near-real-time, streaming, and batch integration architecture
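To make anchor 2 concrete, here is a minimal Python sketch of a plugin-oriented pipeline: each processing step registers itself with a shared registry, so new components can be plugged in without modifying the end-to-end runner. All names here (STEP_REGISTRY, register_step, run_pipeline) are illustrative assumptions, not from any particular framework.

```python
from typing import Any, Callable, Dict, List

# Registry mapping step names to callables; new steps "plug in" here
# without changing the pipeline runner itself.
STEP_REGISTRY: Dict[str, Callable[[Any], Any]] = {}

def register_step(name: str):
    """Decorator that registers a processing step under a name."""
    def decorator(func: Callable[[Any], Any]) -> Callable[[Any], Any]:
        STEP_REGISTRY[name] = func
        return func
    return decorator

@register_step("clean")
def clean(records: List[str]) -> List[str]:
    # Structured and unstructured data usually needs normalization first.
    return [r.strip().lower() for r in records]

@register_step("dedupe")
def dedupe(records: List[str]) -> List[str]:
    # Remove duplicates while preserving order (a reuse-friendly step).
    return list(dict.fromkeys(records))

def run_pipeline(step_names: List[str], data: Any) -> Any:
    """Run the named registered steps end to end, in order."""
    for name in step_names:
        data = STEP_REGISTRY[name](data)
    return data

result = run_pipeline(["clean", "dedupe"], ["  Alice ", "alice", "Bob"])
print(result)  # ['alice', 'bob']
```

The design choice to keep here is the separation: the runner knows nothing about individual steps, so teams can contribute new steps independently, which is what makes the architecture scale beyond a single stand-alone model.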
With this orientation, please proceed to the link, which takes you to relevant reading.