September 25, 2023
Kannan Subbiah
FCA | CISA | CGEIT | CCISO | GRC Consulting | Independent Director | Enterprise & Solution Architecture | Former Sr. VP & CTO of MF Utilities | BU Soft Tech | itTrident
Beyond quality and efficiency, computer vision can help improve worker safety and reduce accidents on the factory floor and other job sites. According to the US Bureau of Labor Statistics, there were nearly 400,000 injuries and illnesses in the manufacturing sector in 2021. “Computer vision enhances worker safety and security in connected facilities by continuously identifying potential risks and threats to employees faster and more efficiently than via human oversight,” says Yashar Behzadi, CEO and founder of Synthesis AI. “For computer vision to achieve this accurately and reliably, the machine learning models are trained on massive amounts of data, and in these particular use cases, the unstructured data often comes to the ML engineer raw and unlabeled.” Using synthetic data is also important for safety-related use cases, as manufacturers are less likely to have images highlighting the underlying safety factors. “Technologies like synthetic data alleviate the strain on ML engineers by providing accurately labeled, high-quality data that can account for edge cases that save time, money, and the headache inaccurate data causes,” adds Behzadi.
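The training-data point is easier to see with a concrete sketch. Below is a minimal, hypothetical example, assuming PyTorch is available, of supplementing a small labeled set of real images with a larger, pre-labeled synthetic set before training a safety-detection classifier. The tensors are random stand-ins for image batches, and every name is illustrative rather than taken from any vendor's tooling.

# Minimal sketch (assumes PyTorch): blending scarce real labels with a larger
# pre-labeled synthetic set for a worker-safety classifier. All data here is
# random stand-in tensors; in practice the synthetic set would arrive already
# labeled from a synthetic-data pipeline and cover rare edge cases.
import torch
from torch import nn
from torch.utils.data import TensorDataset, ConcatDataset, DataLoader

# Small labeled "real" set vs. a larger labeled "synthetic" set
# (e.g. rendered edge cases: poor lighting, occluded PPE, unusual poses).
real_ds = TensorDataset(torch.randn(200, 3, 64, 64), torch.randint(0, 2, (200,)))
synth_ds = TensorDataset(torch.randn(2000, 3, 64, 64), torch.randint(0, 2, (2000,)))

loader = DataLoader(ConcatDataset([real_ds, synth_ds]), batch_size=32, shuffle=True)

# Tiny binary classifier ("safe" vs. "at risk") just to show the training loop.
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 64 * 64, 64), nn.ReLU(), nn.Linear(64, 2))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for images, labels in loader:  # one pass over the combined real + synthetic data
    opt.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    opt.step()

The design point is simply that the synthetic half of the dataset arrives with labels attached, so the ML engineer spends less time annotating raw footage and more time covering the edge cases the article describes.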
Five years on, "the European regulation has inspired data protection around the world and many countries have put privacy standards in place. These include countries in South America such as Argentina, Brazil, and Chile, and in Asia, such as Japan and South Korea. In Australia, the Privacy Act has been in place since 1988, but was recently amended to mirror GDPR concepts. GDPR has also had a strong influence in the US where several states introduced data protection legislation, including California with the California Consumer Privacy Act, and Colorado with the Colorado Consumer Protection Act. On a federal level, the draft American Data Privacy and Protection Act is another example of where regulation is heading." So what impact has it had on how organisations are run and data is handled? Aditya Fotedar, CIO at Tintri, a provider of auto adaptive, workload intelligent platforms, explains that while GDPR has ushered in significant changes, they are built upon existing regulations: "GDPR was a progression on the existing EU privacy laws, main changes being the sub processor contractual clauses, right to forget, and size of the fines."
Companies are increasingly realizing the immense importance of a paradigm shift towards Privacy by Design, because this approach significantly reduces the cost of adapting to new legislation, builds consumer trust, and carries fewer risks. Data protection is here to stay, and this is a realization that everyone – from companies to legislators to consumers – is becoming more and more aware of and acting upon. The important thing now is to approach data protection more proactively – and to make it a general corporate responsibility. Data protection rights are also human rights! So far, the advertising industry has viewed data protection as a drag, but this perception will have to change as we move through 2023. After all, data protection is no longer a limitation, but a selling point. As a result, industry players are beginning to view it as a worthwhile investment rather than a cost. Companies are doing this proactively because they want to stay competitive, keep their brand privacy-centric, and ensure that customers continue to trust them.
Moving storage to another location means disconnecting on-site storage resources, such as SANs, NAS devices, RAID equipment, optical storage and other technologies. But how likely is it that an IT department making a push to cloud storage clears out the storage section of its data center and makes constructive use of the newly empty space? Not very likely, and the organization is still paying for every square foot of floor space in that data center. Assuming IT managers performed a careful, phased migration from on-site storage to the cloud, they probably would have analyzed how to use the space freed up by the migration. If the company owns the displaced storage assets, managers must consider what happens to them after a department or application moves out of the data center. From a business perspective, it may make sense to retain these assets and have them ready for use in an emergency. This approach also ensures that storage resources are available if cloud data repatriation occurs, but it doesn't save space -- or money. Continual advances in computing power can mean that repatriation may not require as much physical space for the same or greater processing speeds and storage capacity.
Am I engaging people on the front lines to formulate DX plans? According to Rogers, the answer should be yes. “You need people on the front lines, because it is the business units who have people out there talking to customers every day,” he says, adding that while C-suite support for transformation is crucial, the front-line perspectives offered by lower-tier employees are those that can identify where change is needed and can truly impact the business. ... Am I identifying and using the right business metrics to measure progress? Most CIOs have moved beyond using traditional IT metrics like uptime and application availability to determine whether a tech-driven initiative is successful. Still, there’s no guarantee that CIOs use the most appropriate metrics for measuring progress on a DX program, says Venu Lambu, CEO of Randstad Digital, a digital enablement partner. “It’s important to have the technology KPIs linked to business outcomes,” he explains. If your business wants to have faster time to market, improved customer engagement, or increased customer retention, those are what CIOs should measure to determine success.
As cloud complexity and maturity grow, the goal for businesses should be more than just “lift and shift” scenarios, especially when such migrations can result in higher costs. The key is understanding how to unlock the real value of cloud services to meet specific organizational needs. For example, with a clear view of how a vendor’s PaaS and SaaS strengths map to business objectives, organizations can release new features, cut costs, and gain powerful new capabilities to support long-term outcomes using predefined ML models. Success demands that systems be continually evaluated for iterative improvements rather than treated as a one-off implementation. After all, technology is constantly evolving, so there is no room to be complacent or to ignore the environment in which infrastructure operates. This is where human insight and expertise play a crucial role. For example, consider the matter of determining the right public or private cloud vendors for the business. Companies operating in highly regulated regions will need to consider how a cloud vendor can ensure data remains compliant with local regulations.
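To make the data-residency point concrete, here is a minimal sketch, assuming AWS S3 accessed via boto3 with credentials already configured in the environment, that flags any bucket created outside an allowed set of regions. The allowed-region set is a hypothetical policy, not a recommendation from the article.

# Minimal data-residency check (assumes boto3 and configured AWS credentials):
# flag any S3 bucket whose region falls outside an allowed set, e.g. to keep
# regulated data inside the EU. The region list below is illustrative only.
import boto3

ALLOWED_REGIONS = {"eu-central-1", "eu-west-1"}  # hypothetical residency policy

s3 = boto3.client("s3")
for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    # get_bucket_location returns None for buckets in the us-east-1 default region
    region = s3.get_bucket_location(Bucket=name)["LocationConstraint"] or "us-east-1"
    status = "OK" if region in ALLOWED_REGIONS else "NON-COMPLIANT"
    print(f"{name}: {region} -> {status}")

A check like this is the kind of continual, iterative evaluation the passage argues for: it runs against the live environment rather than relying on the assumptions made at migration time.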