Initial Screening Interview: Software Development Manager role
Ranjeet Bhargava
SWE/Data Delivery Lead & CSA | Strategic Development for SW Projects | Disciplined Agile Practitioner | Tech Lead | DevOps, SaaS, PaaS, IaaS, Data Platform, .NET Modern Apps | Ex-Optum (UHG) | Ex-Microsoft
Interviewer: Ranjeet, let's dive in. To start, tell me about a challenging project you've led and how you managed your team to success.
Me: Okay, I'm currently working on a kind of prediction engine development with my team. I work with a company called UrNext Technologies, in a cloud solution architect role as well as a leadership engagement role.
Interviewer: Got it. So...
Me: The challenge here is that we have to cater to dissimilar databases, the source ones, and the transformation...
Interviewer: Nice mix.
Me: ...the transformation of data from the source into usable data that the prediction engine can consume, and extracting the insights needed to forecast the kinds of values we expect, was the core challenge. Other than that, this is basically a prediction engine for credit card users on a cross-country user platform, so the main challenges were country-based: time-based challenges, timezone challenges, and usability challenges for that case. Overall, we figured out the common cross-connections between the dissimilar databases, and what we do is...
Well...
Interviewer: Go ahead.
Me: ...extract, load, and transform. In this case it's not ETL, it's an ELT kind of approach, yeah.
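A minimal sketch of the ELT idea described above, assuming a SQL destination and the Microsoft.Data.SqlClient package; the staging table, column names, time zone handling, and connection string are hypothetical, not taken from the interview.

```csharp
// ELT sketch: land raw source rows first (extract + load), then transform inside the destination.
// Assumes Microsoft.Data.SqlClient; table, column, and time zone names are illustrative only.
using System.Data;
using Microsoft.Data.SqlClient;

var sourceDataTable = new DataTable();            // in practice, filled by reading the source database
await using var conn = new SqlConnection("<destination-connection-string>");
await conn.OpenAsync();

// 1. Load: copy the raw rows into a staging table exactly as they arrive.
using (var bulk = new SqlBulkCopy(conn) { DestinationTableName = "staging.RawTransactions" })
{
    await bulk.WriteToServerAsync(sourceDataTable);
}

// 2. Transform: reshape inside the destination, e.g. normalize per-country timestamps to UTC.
await using var transform = new SqlCommand(@"
    INSERT INTO dbo.Transactions (MerchantId, Amount, OccurredAtUtc)
    SELECT MerchantId, Amount,
           OccurredAt AT TIME ZONE SourceTimeZone AT TIME ZONE 'UTC'
    FROM staging.RawTransactions;", conn);
await transform.ExecuteNonQueryAsync();
```

The point is simply that the heavy reshaping happens after loading, inside the destination platform, rather than in a separate transform stage before the load, which is what distinguishes ELT from classic ETL.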
Interviewer: Sounds like you navigated a tangled web of data and timezone challenges, especially across such varied databases and user bases! By opting for extract, load, and transform over the usual ETL, you seem to have more effectively adapted to the complexity of the task. How did your team manage the performance aspect of handling all that cross-country data?
Me: Yeah, so we have developers on that side. That was the main challenge and why we opted to develop some integration APIs to take in the third-party source data. These integration APIs were essentially data orchestration platform-based APIs, where we used Azure Data Factory so the overall transaction flow could be carried out successfully.
For some countries, when we tried to bring the data in, the data formats and timings required more scale-up on the API side. So we went on to develop a custom API in .NET Core. There were five different segments of cross-country requirements; for those we developed three different APIs. Two of the requirements were common enough to be catered for within the same API, more as extensions of an API than full-fledged APIs.
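As a sketch of what one of those .NET Core integration endpoints could look like, assuming an ASP.NET Core minimal API; the route, segment parameter, and payload shape below are illustrative assumptions rather than the actual implementation.

```csharp
// Sketch: an ASP.NET Core minimal API serving as a common ingestion endpoint for third-party data,
// with the cross-country segment passed as a route parameter (segment codes are hypothetical).
var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// Common endpoint covering the shared segments; segment-specific behavior would branch on `segment`.
app.MapPost("/ingest/{segment}", (string segment, SourceRecord[] batch) =>
{
    // In the real system this would validate the batch and hand it to the ADF-driven pipeline;
    // here we only acknowledge receipt.
    return Results.Accepted($"/ingest/{segment}", new { segment, received = batch.Length });
});

app.Run();

// Illustrative payload shape for a single source record.
record SourceRecord(string MerchantId, decimal Amount, DateTimeOffset OccurredAt);
```

Keeping the common segments behind one endpoint and treating the country-specific pieces as variations matches the "extensions of an API rather than full-fledged APIs" idea described above.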
Interviewer: Smart move using Azure Data Factory and custom .NET Core APIs! Sounds like you skillfully navigated the tricky waters of data standardization and scalability. How did you deal with any timezone discrepancies among the different countries' data?
Seems like your ability to adapt and innovate really paid off in this project. Customizing APIs based on .NET Core and leveraging Azure Data Factory must've given you quite a bit of flexibility. How did you find the balance between customization and maintainability for your team?
Me: Yeah, that's a good question. Actually, I missed some of your lines in between, but what I got is that you're expecting me to explain how I manage the...?
Interviewer: Yeah, exactly! How did you balance customization with maintainability in your API development?
Me: For the team's challenges in between, like their work process and other things, we adopted a Disciplined Agile practice rather than pure agile. In prediction engine development, tuning and collecting the data is a time-consuming process, so we went with a hybrid model overall.
The whole environment is based on Azure managed identity, so the developer side and the DB side are both part of the same managed identity, and that's how we track access. The API development runs on an Azure DevOps CI/CD pipeline, and we kept the release flow to just a couple of stages: first from local dev to the dev release, then from dev to QA, and the third step is straight to production. We don't have a separate pre-production launch because the QA release itself gives us a kind of pre-production environment.
Since the CI/CD pipeline is based on Azure DevOps, it was quite easy to see the current flow, the burndown chart, and the overall progress through the DevOps dashboard itself.
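As a minimal sketch of the managed-identity idea, assuming the Azure.Identity and Azure Key Vault SDKs; the vault URI and secret name are hypothetical placeholders rather than details from the project.

```csharp
// Sketch: service code authenticating through Azure Managed Identity instead of stored credentials.
// Requires the Azure.Identity and Azure.Security.KeyVault.Secrets packages; names are placeholders.
using Azure.Identity;
using Azure.Security.KeyVault.Secrets;

// DefaultAzureCredential resolves to the managed identity when the code runs inside Azure,
// so neither the dev side nor the DB side needs to handle raw secrets directly.
var credential = new DefaultAzureCredential();
var vault = new SecretClient(new Uri("https://example-vault.vault.azure.net/"), credential);

KeyVaultSecret dbConnection = vault.GetSecret("prediction-db-connection");
Console.WriteLine($"Retrieved secret '{dbConnection.Name}' via managed identity.");
```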
Interviewer: Smart move pivoting to a hybrid approach. Using Azure Managed Identity likely helped with security, and nice approach with Disciplined Agile practices and Azure DevOps! Sounds like the blend of managed identities and a clear dev-to-production flow has been key to making it all manageable. What do you enjoy most about this leadership and problem-solving role?
Me: It's quite a challenging role. When we bring together the typical data science folks and the typical dev folks, they each come with their own boundaries, and being in the project lead and team manager role, I have to break down those boundaries so that each group interacts with the other domain.
Interviewer: It really sounds like you've navigated the...
Me: It was quite a challenging role, but now those technology boundaries are breaking down and things are moving. Overall, I can say it's been a good experience, and it's still continuing.
Right now we have one customer; I cannot disclose the customer's name. That particular vendor has more than 1,000 merchant users, and these merchant users have their own portals, their own e-commerce sites, where we are embedding the prediction-based APIs so that the end customers benefit from them.
So now we have one major client, and we are moving on to a second. For the second one things should be easier, so the development should be easier too, because the first integration, the API development, and the DevOps platform gave us lessons learned that we can definitely utilize for the next customer's delivery. That's all I can say.
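To illustrate how a merchant portal might consume such a prediction API, here is a small sketch; the base address, route, and request/response fields are hypothetical, not taken from the engagement.

```csharp
// Sketch: a merchant portal backend calling the prediction API over HTTP.
// Endpoint, payload, and response shape are illustrative assumptions.
using System.Net.Http.Json;

var http = new HttpClient { BaseAddress = new Uri("https://prediction-api.example.com/") };

// Hypothetical request: ask for a 30-day forecast for one merchant in one country.
var request = new { MerchantId = "M-1001", CountryCode = "IN", HorizonDays = 30 };
var response = await http.PostAsJsonAsync("api/forecast", request);
response.EnsureSuccessStatusCode();

// Hypothetical response: date -> predicted value.
var forecast = await response.Content.ReadFromJsonAsync<Dictionary<string, decimal>>();
Console.WriteLine($"Received {forecast?.Count ?? 0} forecast points.");
```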
Interviewer: Well, sounds like you've got a solid learning curve and a fascinating project! I'm sure each new client will bring its own set of challenges and opportunities, but it sounds like you're well-prepared to tackle them. What are you most excited about as you continue expanding this project to new clients?
Me: Yeah, we already know these plans are in place at the top level. Leadership is involved in it, and we are also going to certain events to showcase the work and bring in bigger players. I'd say this prediction engine should go forward as a good product, and we are also targeting other industries where a similar use case can be adopted. So all of these things are in the pipeline, and it depends on top leadership how they want to grow it.
Interviewer: Ambitious goals! Taking the prediction engine to new industries could open up so many possibilities. I'll be rooting for you and the team to keep pushing boundaries. Kudos on your leadership, Ranjeet! Anything else you'd like to share about your professional journey?
Me: Okay, thank you, thanks a lot. Thanks a lot for the nice questions. Thank you.