April 21, 2023
Kannan Subbiah
FCA | CISA | CGEIT | CCISO | GRC Consulting | Independent Director | Enterprise & Solution Architecture | Former Sr. VP & CTO of MF Utilities | BU Soft Tech | itTrident
It's a seamless blend of technology and human interaction that Humane believes can extend to daily schedule run-downs, map directions, and visual aids for cooking or fixing a car engine -- as suggested by the company's public patents. The list goes on. Chaudhri also demoed the wearable's voice translator, which converted his English into French while using an AI-generated voice to retain his tone and timbre, as reported by designer Michael Mofina, who watched the recorded TED Talk before it was taken down. Mofina also shared an instance in which the wearable recapped the user's missed notifications without sounding invasive, framing them as, "You got an email, and Bethany sent you some photos." Perhaps the biggest draw to Humane and its AI projector is the team behind it. That roster includes Chaudhri, a former Director of Design at Apple who worked on the Mac, iPod, iPhone, and other prominent devices, and Bethany Bongiorno, also from Apple, who was heavily involved in the software management of iOS and macOS.
Generative AI uses massive language models, it’s processor-intensive, and it’s rapidly becoming as ubiquitous as browsers. This is a problem because existing, centralized datacenters aren’t structured to handle this kind of load. They are I/O-constrained, processor-constrained, database-constrained, cost-constrained, and size-constrained, making a massive increase in centralized capacity unlikely in the near term, even though the need for this capacity is going vertical. These capacity problems will increase latency, reduce reliability, and over time could throttle performance and reduce customer satisfaction with the result. What is needed is a more hybrid approach, in which the AI components necessary for speed are retained locally (on devices) while the majority of the data resides centrally, reducing datacenter loads and decreasing latency. Without a hybrid solution -- where smartphones and laptops can do much of the work -- use of the technology is likely to stall as satisfaction falls, particularly in areas such as gaming, translation, and conversations, where latency will be most annoying.
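As a rough illustration of what such a hybrid split could look like, the sketch below routes latency-sensitive requests to a small on-device model and everything else to a centralized service. The task categories, stub functions, and simulated latency are assumptions for illustration only, not something described in the article.

```python
# Illustrative sketch of a hybrid local/remote split for generative AI.
# The model stubs, task categories, and latency figure are assumptions.
import time

LATENCY_SENSITIVE = {"translation", "conversation", "game_dialogue"}

def run_local(prompt: str) -> str:
    # Placeholder for a small on-device model (fast, limited capability).
    return f"[local] {prompt[:40]}..."

def run_remote(prompt: str) -> str:
    # Placeholder for a call to a centralized datacenter model.
    time.sleep(0.5)  # simulate network and queueing latency
    return f"[remote] {prompt[:40]}..."

def generate(prompt: str, task: str) -> str:
    """Keep latency-sensitive tasks on the device; send the rest to the cloud."""
    return run_local(prompt) if task in LATENCY_SENSITIVE else run_remote(prompt)

if __name__ == "__main__":
    print(generate("Translate 'good morning' to French", task="translation"))
    print(generate("Summarize this 200-page report", task="batch_summary"))
```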
The first notable application is code improvement. Auto-GPT can read, write and execute code, and thus can improve its own programming. The AI can evaluate, test and update code to make it faster, more reliable, and more efficient. In a recent tweet, Auto-GPT’s developer, Significant Gravitas, shared a video of the tool checking a simple example function responsible for math calculations. While this particular example only contained a simple syntax error, the AI corrected the mistake in roughly a minute -- a fix that could take a human much longer to track down in a codebase containing hundreds or thousands of lines. ... The second notable application is in building an app. Auto-GPT detected that Varun Mayya needed the Node.js runtime environment to build an app, and that it was missing from his computer. Auto-GPT searched for installation instructions, downloaded and extracted the archive, and then started a Node server to continue with the job. While Auto-GPT made the installation process effortless, Mayya cautions against using AI for coding unless you already understand programming, as it can still make errors.
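For context, the kind of trivial defect described is roughly the following. This is a hypothetical Python example, not the actual function from the Significant Gravitas demo.

```python
# Hypothetical example of the kind of one-character syntax error described above.
# Broken version (missing colon on the function definition):
#   def add_numbers(a, b)
#       return a + b

# Corrected version after review:
def add_numbers(a, b):
    """Return the sum of two numbers."""
    return a + b

assert add_numbers(2, 3) == 5
```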
Gathering telemetry data can be a challenge, and with OpenTelemetry now handling essential signals like metrics, traces and logs, you might feel the urge to save your company some cash by building your own system. As a developer myself, I totally get that feeling, but I also know how easy it is to underestimate the effort involved by just focusing on the fun parts when kicking off the project. No joke, I’ve actually seen organizations assign teams of 50 engineers to work on their observability stack, even though the company’s core business is something else entirely. Keep in mind that data collection is just a small part of what observability tools do these days. The real challenge lies in data ingestion, retention, storage and, ultimately, delivering valuable insights from your data at scale. ... At the very least, auto-instrumentation will search for recognized libraries and APIs and then add some code to indicate the start and end of well-known function calls. Additionally, auto-instrumentation takes care of capturing the current context from incoming requests and forwarding it to downstream requests.
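As a rough Python sketch of what that auto-instrumentation amounts to, the snippet below does manually what an instrumentation library would normally patch in for you: it opens a span around a well-known outbound call and injects the current trace context into the outgoing headers so a downstream service can continue the trace. The URL and function name are illustrative, and without a configured OpenTelemetry SDK these API calls are harmless no-ops.

```python
# Sketch of what OpenTelemetry auto-instrumentation effectively does:
# wrap a well-known call in a span and propagate trace context downstream.
# The target URL and function name are illustrative only.
import requests
from opentelemetry import trace
from opentelemetry.propagate import inject

tracer = trace.get_tracer(__name__)

def call_downstream(url: str) -> int:
    # The span marks the start and end of the well-known outbound call.
    with tracer.start_as_current_span("HTTP GET"):
        headers = {}
        inject(headers)  # forward the current trace context to the downstream service
        response = requests.get(url, headers=headers)
        return response.status_code

if __name__ == "__main__":
    call_downstream("http://example.com/api/orders")
```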
The Italian authority says OpenAI is not being transparent about how it collects users’ data during the post-training phase, such as in chat logs of their interactions with ChatGPT. “What’s really concerning is how it uses data that you give it in the chat,” says Leautier. People tend to share intimate, private information with the chatbot, telling it about things like their mental state, their health, or their personal opinions. Leautier says it is problematic if there’s a risk that ChatGPT regurgitates this sensitive data to others. And under European law, users need to be able to get their chat log data deleted, he adds. OpenAI is going to find it near-impossible to identify individuals’ data and remove it from its models, says Margaret Mitchell, an AI researcher and chief ethics scientist at startup Hugging Face, who was formerly Google’s AI ethics co-lead. The company could have saved itself a giant headache by building in robust data record-keeping from the start, she says. Instead, it is common in the AI industry to build data sets for AI models by scraping the web indiscriminately and then outsourcing the work of removing duplicates or irrelevant data points, filtering unwanted things, and fixing typos.
Many businesses are trying hard right now to stay profitable during these times of economic uncertainty. The startling takeaway for us was that business and technical leaders see cloud analytics as the tool -- not a silver bullet, but a critical component -- for staying ahead of the pack in the current economic climate. What's more, organizations need to do more with less, and, as it turns out, cloud analytics is a wise investment not only in good economic times but also in more challenging ones. Businesses reap benefits from the same solution (cloud analytics) in either scenario. For example, cloud analytics is typically more cost-effective than on-premises analytics solutions because it eliminates the need for businesses to invest in expensive hardware and IT infrastructure. It also offers the flexibility businesses need to quickly experiment with new data sources, analytics tools, and data models to get better insights -- without having to worry about the underlying infrastructure.