Day Two at re:Invent
Day two started with an excellent keynote from Adam Selipsky, jam-packed with exciting announcements. We will be sharing some thoughts later in the week reflecting on what some of those announcements mean for us here at Gemba.
Lucas Dobinson started his day by reflecting on two data-engineering-centric sessions from his first day at re:Invent.
Arran Hartgroves delved into some deep graph-modelling learning at DAT317 | Graph modeling best practices with Amazon Neptune
"Graph databases seem to be a popular and evolving area, and that converging with cloud scale puts Neptune as a key offering to get to grips with going forward. Not only for our software developers, but our Product Thinking community who have less of a gap, in my opinion, between their analytic business work and what would end up in Neptune as a Graph. It also offers a path for our more people and process colleagues to dip their toe into the solution space and a new technology e.g. traversing a complicated graph to answer business questions with some OpenCypher."
Kim Reid has been exploring what Jupyter AI can do at OPN203 | Jupyter AI: Open source brings LLMs to your notebooks and was left with some great ideas about how she can put it to work!
"I went to a session about a recently developed Jupyter extension, Jupyter AI and I walked out genuinely excited to tell everyone about it! Firstly, great speakers - knowledgeable and clearly passionate about their product which made listening effortless. Secondly, the content - wow! I thought the AI aspect would be gimmicky or unusable but this extension provided support for coding with responsible, transparent, data secure and accessible generative AI even within restricted/isolated environments (BIG WIN!). I loved the safety reminders to users about the origin of information (AI generated, specific LLMs etc) and the corresponding tagging mechanisms; tackling growing concerns about misleading AI content. The consistent design language throughout the extension would fool you into thinking it was part of the original Jupyter offering and made for a satisfying demo. So, what does it offer? Entire notebook generation, debugging support, coding suggestions and modifications through Jupyter magic commands or the chatbot option (Jupyternaut). How can it help me as a big data platform owner? Encouraging hardware sympathetic coding on big data platforms, easing the onboarding process for tradecraft creators and reducing operational support. I can’t wait to take this one home and start integrating!
One of our engineers scooted over for the all-day bootcamp TNC213 | Practical data science with Amazon SageMaker.
"This was an interesting session that gave a good overview of machine learning in general as well as demonstrations of how to implement it using Amazon SageMaker. My immediate takeaway was that it was useful to see things from a data scientist’s perspective. My role involved dealing with data ingestion and I have learned things that I can take forward when thinking about the data quality requirements for machine learning."
Chris Mallon spent part of his day two at re:Invent learning more about event detection at ANT401-R | Event detection with MSK and Amazon Managed Service for Apache Flink
"Upshot was that they crammed a lot of cool stuff together into a 2 hour package that didn’t feel rushed but at the same time gave you enough to think about. They had you walk through setting everything up so you got an idea of how the components worked together and there was some Brucey bonuses in there with them letting you play with stuff like Apache zeppelin on the side (notebook interface that can work with Flink). Really great workshop. I suspect it’ll be very relevant for future atlas work even if we won’t be using flink directly on AWS. Also hats off to fictional user Karl Coronel who my layout managed to spot was spending thousands of dollars every few minutes."
Ewan-James Thomas headed down to Caesars for his session on incident response, BIZ203-R1 | Bolstering incident response with AWS Wickr and Amazon EventBridge
"This was an insightful talk on how SOC teams or DevSecOps teams can communicate in a more secure way, and how they can get updates on the most urgent issues in their AWS environments pushed straight to a chat client that’s available on iOS, Android, MacOS, Windows, and Linux. What I found interesting was how AWS Wickr secures comms—each device/endpoint has its own key for a given chat and that key is stored only on that device/endpoint. This means that even if your network is compromised, you can continue to chat through these devices since keys are stored on device and not on KMS, the devices themselves would have to be compromised too in order for an adversary to decrypt the messages. You can even (optional) configure headless clients in order to save messages for data-retention purposes. The talk really made me feel that AWS Wickr could be beneficial to our security team on project.
Finally, Christopher Pantelli spent some time learning what AWS Amplify can offer at FWM314 | Accelerate web and mobile development with AWS Amplify.
"As our company grows, the requirement to build small internal tools follows. Today, I joined a session discussing how we can leverage AWS Amplify to quickly build and deploy web apps. The recently announced AWS Amplify Gen 2 allows us to integrate our existing identity provider so we can utilise our user directory and permissions groups, enhancing the efficiency and security of our web applications in minutes."