Part 6: Engineering challenges - Connectors
Now that we’ve discussed what the next-generation Identity Management (IdM) platform should look like and what needs to be built, let’s dive into the technical challenges that come with implementing connectors. This section is quite technical, so if you’re expecting a comparison of products or pricing, you may want to skip ahead.
Connectors are the backbone of any IdM system, facilitating the import and export of data efficiently. Let’s explore the hurdles a newly appointed engineering manager will face and why, in some cases, they might prefer to run the other way.
General considerations
While it’s common practice to implement unit tests with high code coverage, the real necessity for connectors is comprehensive end-to-end integration tests. Unit tests often cover basic functionality with controlled data but fail to simulate real-world challenges such as authentication failures, timeouts, performance degradation, or unexpected input (like HR systems repurposing fields in unpredictable ways). Nor can they simulate changes in behavior caused by connected-system upgrades, reconfiguration, and the like.
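To make the distinction concrete, here is a minimal sketch (all names hypothetical) of the kind of failure handling an integration test must exercise and a unit test with happy-path fixtures never does: an import loop that retries transient timeouts but aborts immediately on authentication failure.

```python
import socket
from typing import Callable, Optional


class ConnectorError(Exception):
    """Raised when an import cannot be completed."""


def import_users(fetch_page: Callable[[Optional[str]], dict],
                 retries: int = 3) -> list:
    """Import all users page by page.

    Transient failures (timeouts) are retried up to `retries` times;
    an authentication failure is never retried blindly, because hammering
    a locked-down endpoint only makes things worse.
    """
    users: list = []
    cursor: Optional[str] = None
    attempts = 0
    while True:
        try:
            page = fetch_page(cursor)
        except PermissionError:
            # Auth failure: abort with a clean, actionable error.
            raise ConnectorError("authentication failed; aborting import")
        except socket.timeout:
            # Transient failure: retry with a cap.
            attempts += 1
            if attempts > retries:
                raise ConnectorError("endpoint kept timing out")
            continue
        users.extend(page["resources"])
        cursor = page.get("next")
        if cursor is None:
            return users
```

An integration test points `fetch_page` at a real (or realistically misbehaving) system; that is where the upgrade- and reconfiguration-induced surprises show up.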
Integration tests need to target all supported systems, which presents a significant challenge: how do you configure and maintain these test environments? Most software developers (SDEs) lack the expertise to set up complex systems like DB2, OpenLDAP, DirX, or Domino in a realistic manner. Managing multiple versions of each system (for example, Oracle client 11g behaves differently from 8i; DB2 on NT is not completely the same as DB2 on AS/400), navigating licensing issues, and ensuring proper configuration in test environments are tasks that often fall outside the typical skill set of an SDE.
Given these complexities, many teams attempt to outsource this task. However, even when outsourced, the responsibility for system failures remains with the internal team. In many cases, the engineering manager ends up setting up virtual machines, configuring networks, and troubleshooting environments to get integration tests running smoothly. Manager or not, I had to get my hands on those VMs, containers, and virtual subnets, set up NAT and routing, configure whatever had to be configured, and fix what had to be fixed. I needed a result, ASAP, and DIY was one of the options.
Despite the difficulties, integration tests are critical for ensuring the success of IdM connectors, though they are expensive and time-consuming to maintain.
Let’s talk about individual connectors now.
SCIM
SCIM (System for Cross-domain Identity Management) is a standard for identity management in cloud environments; it has been reasonably successful and is becoming the de facto #1 in the cloud world.
It’s essentially a JSON/REST-based connector with a predefined schema and some extensibility. While SCIM promises standardization, implementations vary widely.
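As a baseline for what implementations then diverge from, the SCIM 2.0 core schema (RFC 7643) defines the `User` resource with a required `userName` and the `urn:ietf:params:scim:schemas:core:2.0:User` schema URN. A minimal sketch of building a conforming payload:

```python
import json


def build_scim_user(user_name: str, given: str, family: str,
                    active: bool = True) -> str:
    """Build a minimal SCIM 2.0 User payload (RFC 7643 core schema).

    Real-world servers add vendor extensions and interpret optional
    attributes differently, which is exactly where 'standardization'
    starts to fray.
    """
    payload = {
        "schemas": ["urn:ietf:params:scim:schemas:core:2.0:User"],
        "userName": user_name,
        "name": {"givenName": given, "familyName": family},
        "active": active,
    }
    return json.dumps(payload)
```

The connector POSTs this to the server’s `/Users` endpoint; the trouble begins with everything the core schema does not pin down.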
One of the biggest drawbacks is the lack of modern, passwordless, secure authentication methods in the SCIM protocol. Basic authentication and bearer tokens (often transmitted over public networks) remain common, though they carry security risks. Mutual TLS authentication helps but isn’t foolproof. Moreover, SCIM implementations depend on DNS and PKI for certificate validation, which can degrade performance in environments where access to external services is restricted.
Because certificate validation depends on DNS and on PKI infrastructure such as CRL Distribution Points (CDPs), you may face performance degradation from the extra calls (to CDPs) needed to verify that a presented certificate is not on a Certificate Revocation List (CRL). The operating system itself makes those calls and caches the lists; however, customers very often disable caching (to be more secure) or block access to any external websites (to be even more secure), and the whole thing just breaks. That’s not even mentioning air-gapped environments with internal certificates – a whole other world.
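The design question the connector has to answer when the CDP is unreachable is policy, not code: fail open (accept the cert, weaker security) or fail closed (reject it, and break in air-gapped or egress-blocked environments). A minimal sketch of that decision point, with a hypothetical `fetch_crl` callable standing in for the actual download:

```python
from typing import Callable, Set


def check_revocation(serial: int,
                     fetch_crl: Callable[[str], Set[int]],
                     cdp_url: str,
                     fail_closed: bool = False) -> bool:
    """Return True if the certificate is acceptable.

    `fetch_crl` downloads the CRL from the CDP and returns revoked serials.
    When the CDP is unreachable (blocked egress, disabled caching, air gap),
    the `fail_closed` policy decides: fail open (accept) or fail closed
    (reject). Both names here are illustrative, not a real library API.
    """
    try:
        revoked = fetch_crl(cdp_url)
    except (TimeoutError, OSError):
        return not fail_closed
    return serial not in revoked
```

Either choice will make someone unhappy; the connector should at least make it an explicit, documented setting.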
I have yet to see Proof-of-Possession (PoP) tokens used for authentication; they would require your client to be hosted in a cloud.
Despite these challenges, SCIM is a robust connector for cloud-based environments, but it requires constant updates and custom templates for each connected system.
LDAP
LDAP (Lightweight Directory Access Protocol) remains prevalent in on-premises environments, despite the rise of cloud-based services. While the basic functionality of LDAP is well understood, supporting various authentication mechanisms, schema extensions, and delta imports across different LDAP server implementations (like OpenLDAP or Active Directory) can be daunting.
I won’t be talking about the challenges of implementing the various authentication (binding) mechanisms. They are all well known, from a simple bind with basic authentication to Kerberos and mutual certificate-based authentication. All I can say is: hire an LDAP sysadmin, yesterday.
Going back to challenges, the main problem is not the implementation itself but the sheer variety of things to support: schema extensions, delta-import mechanisms, and server-specific behaviors that differ between implementations like OpenLDAP and Active Directory.
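One small but representative detail a connector must get right everywhere is escaping user-supplied values in LDAP search filters (RFC 4515). HR data routinely contains parentheses and asterisks, and an unescaped value breaks the query – or worse, injects one. A self-contained sketch:

```python
def escape_ldap_filter(value: str) -> str:
    """Escape special characters in an LDAP search filter value (RFC 4515).

    '\\', '*', '(', ')' and NUL must be encoded as backslash plus two
    lowercase hex digits, so a cn like 'Smith (admin)' can be searched for
    safely.
    """
    escaped = []
    for ch in value:
        if ch in ('\\', '*', '(', ')', '\x00'):
            escaped.append('\\%02x' % ord(ch))
        else:
            escaped.append(ch)
    return ''.join(escaped)
```

Multiply this kind of detail by every attribute syntax, matching rule, and server quirk, and the spec-writing burden becomes clear.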
So, the biggest challenge is not coding but having proper technical specs authored before you start.
I’m not mentioning password sync from AD DS (using password filters and services like MIM PCNS), and I’m not mentioning password sync into connected systems (in OpenLDAP, for example, passwords may need to be hashed and salted with SHA rather than sent as plain text) – let the future be passwordless.
I’m not mentioning the three-step user creation process some directories require: create a disabled user with an empty password, reset the password, then enable the user – you will face all of that later.
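The three-step dance above is simple but easy to get subtly wrong (order matters, and a failure mid-way leaves a half-created account to reconcile). A sketch, with `client` as a hypothetical directory wrapper rather than any real LDAP library:

```python
def create_user_three_step(client, dn: str, attrs: dict, password: str) -> None:
    """Create a user the way some directories insist on.

    A single 'create enabled user with password' call is rejected by some
    servers, so the connector performs three operations in strict order.
    `client` is a hypothetical wrapper; the method names are illustrative.
    """
    # 1. Create the account disabled, with an empty password.
    client.add(dn, {**attrs, "enabled": False})
    # 2. Set the real password via the server's password-reset operation.
    client.set_password(dn, password)
    # 3. Only now enable the account.
    client.enable(dn)
```

A robust connector also needs a cleanup/retry story for when step 2 or 3 fails, which is where most of the real work hides.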
An LDAP sysadmin is essential for configuring these environments and ensuring they operate smoothly. Without one, managing LDAP connectors becomes a time-consuming and error-prone process.
ODBC/JDBC
Similar to LDAP, ODBC/JDBC connectors face challenges due to the variety of relational database management systems (RDBMS) and their unique requirements. Differences in SQL dialects, authentication methods, and schema structures make it difficult to build a one-size-fits-all solution.
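One concrete example of dialect divergence: drivers do not even agree on placeholder syntax. In Python’s DB-API terms, sqlite3 uses `qmark` (`?`), psycopg2 uses `format` (`%s`), and cx_Oracle uses `numeric`/named (`:1`, `:name`). A generic SQL connector ends up normalizing queries per driver; a tiny sketch:

```python
def render_placeholders(n: int, paramstyle: str) -> str:
    """Render `n` bind-parameter placeholders for a given DB-API paramstyle.

    Only three common styles are sketched here; real connectors also deal
    with 'named' and 'pyformat', plus identifier quoting and type mapping.
    """
    if paramstyle == "qmark":      # sqlite3, many ODBC drivers
        return ", ".join("?" * n)
    if paramstyle == "format":     # psycopg2, MySQLdb
        return ", ".join(["%s"] * n)
    if paramstyle == "numeric":    # cx_Oracle
        return ", ".join(f":{i}" for i in range(1, n + 1))
    raise ValueError(f"unsupported paramstyle: {paramstyle}")
```

Placeholders are the easy part; dates, LOBs, and case-sensitivity of identifiers diverge even more.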
You should also expect customers to use your Generic SQL (GSQL) connector in ways you never intended: for example, provisioning and manipulating AS/400 users by calling DB2 stored procedures that execute system commands, instead of buying an LDAP-to-AS/400 gateway from a third-party vendor. Or using the main instance for export and a read-only replica for imports, with a one-hour replication delay, and then complaining about export-change-not-reimported errors.
A knowledgeable database administrator is crucial for navigating these challenges, especially when supporting multiple versions of RDBMS in test environments.
PowerShell / Scripting
PowerShell and other scripting connectors offer flexibility but come with their own set of challenges.
The main catch here is the approach to take. You can try to develop a generic PowerShell connector that simply maps each ECMA2 method to a script. But developing a set of scripts to manage, say, Lync users won’t be much easier than writing a custom C# connector. Yes, it’s a bit faster to develop (and almost impossible to troubleshoot and test, unless you’ve authored a TestHarness.ps1), but you still have to understand the whole framework. The other approach is to return plain PowerShell objects (see sorengranfeldt/psma on GitHub, a PowerShell Management Agent for FIM 2010 and MIM 2016) and convert those PS objects into connector-framework objects at runtime.
Either way, the main challenges are authenticating to remote systems in various ways – impersonating the account the script runs under versus passing a PSCredential object – and choosing whether to store your scripts as part of the connector config or as a link to a local file. And timeouts. Those are real. A long-running PS script that establishes a remote PowerShell session and hangs for no reason (OK, there is always a reason – something failed on the other side) is what I kept facing while supporting PS connectors.
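From the connector’s side, a hung remote session looks like a child process that never exits, so the only defense is a hard timeout and a kill. A minimal sketch (in Python for illustration, but the shape is the same whether the child is pwsh, bash, or ksh):

```python
import subprocess
import sys


def run_script(argv: list, timeout_s: float):
    """Run an external script with a hard timeout.

    Returns the CompletedProcess on success, or None when the script hung
    past the deadline, after which subprocess.run has already killed it.
    The caller treats None as 'script hung; retry or alert'.
    """
    try:
        return subprocess.run(argv, capture_output=True, text=True,
                              timeout=timeout_s)
    except subprocess.TimeoutExpired:
        return None
```

The uncomfortable part is choosing `timeout_s`: too short and slow-but-healthy runs get killed; too long and a hung import blocks the whole sync cycle.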
In many cases, in-house IAM teams use PowerShell connectors to repurpose existing scripts, but this can lead to performance issues if the scripts are not optimized for production environments. Be ready to provide ‘commercially reasonable support’ for those cases.
Replace PowerShell with ZSH / KSH / BASH and you’ll get the same challenges.
Webservices
Web services connectors (REST/SOAP) offer extensive customization but can be challenging to implement correctly. Authentication methods, schema discovery, and import/export flow design are common pain points.
The reality is that web services connectors are too complicated for small customers. And if we’re targeting consultants, plenty can still go wrong.
Building robust web service connectors requires not only technical knowledge of the target systems but also thorough documentation and ongoing updates to keep pace with changes in the external APIs.
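One recurring piece of import/export flow design is paging plus delta tracking: walk `nextLink` pages to exhaustion, then persist a `deltaLink` watermark for the next run. Both link names here are assumptions modeled on common REST delta-query designs, not any specific API:

```python
def full_import(get_page):
    """Paged full import for a hypothetical REST API.

    `get_page` takes a relative link and returns a parsed JSON page with a
    'value' array, an optional 'nextLink' (more pages), and on the final
    page a 'deltaLink' to persist as the watermark for the next delta run.
    """
    objects, link = [], "/users?$top=100"
    page = {}
    while link:
        page = get_page(link)
        objects.extend(page.get("value", []))
        link = page.get("nextLink")
    return objects, page.get("deltaLink")
```

Losing or mishandling the watermark is a classic source of silent drift between the IdM system and the target, so the delta link deserves the same durability guarantees as the imported data itself.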
Even with previous experience with SAP R/3 (and other random things like DB2, AS/400, OpenLDAP, Sun One DSE, *BSD, and Lotus Domino – plus a stint as a penetration tester targeting AD DS and Oracle/InterBase databases), it still took me a couple of days to author a ‘Deploying SAP NetWeaver AS ABAP 7’ document explaining how to expose the proper BAPIs as Web Services in SAP ECC, and a few more days to author ‘Authoring SAP ECC 7 Template for ECMA2Host’, explaining how to create a Web Service connector template that implements basic user provisioning from Entra ID to SAP ECC. In the real world, you most probably don’t have a standalone SAP ECC instance with one mandate; you have SAP CUA in place with all systems connected, and you don’t just manipulate users – you have to manage local and global activity groups, profiles, roles, systems, and licenses; implement delta tracking on the SAP side; convert group:member into user:memberOf; and so on.
Very often, IAM vendors substitute templates for connectors in their marketing materials. So when someone says ‘We offer 1,000 integrations’, it most probably means one Web Service connector with 1,000 templates. You will need a separate team to author those templates and keep them up to date. The same applies to SCIM connectors.
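To make the connector-versus-template distinction concrete: a template in this sense is data, not code – a declarative mapping that one generic Web Service connector interprets per target system. The field names below are invented for illustration (though BAPI_USER_CREATE1 is a real SAP function module name):

```python
# A hypothetical template: endpoint plus attribute mapping, no logic.
SAP_TEMPLATE = {
    "create_endpoint": "/BAPI_USER_CREATE1",
    "attribute_map": {"userName": "USERNAME", "lastName": "LASTNAME"},
}


def apply_template(template: dict, idm_object: dict) -> dict:
    """Translate an IdM object into a target-system request using a template.

    The generic connector owns transport, auth, and error handling; the
    template only says which endpoint to call and how attributes map.
    Unmapped attributes are dropped.
    """
    body = {template["attribute_map"][k]: v
            for k, v in idm_object.items()
            if k in template["attribute_map"]}
    return {"endpoint": template["create_endpoint"], "body": body}
```

This is why ‘1,000 integrations’ can be one codebase: the engineering cost moves from writing connectors to authoring and maintaining 1,000 of these mappings as the target APIs change.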
Conclusion
The technical challenges associated with building and maintaining IdM connectors are numerous. A strong team of system administrators, database administrators, and developers is essential for success. Without this support, even the most talented engineering manager will struggle to maintain reliable integration tests and production environments (and, most probably, will be trying their luck for no reason).
Engineering Manager | Microsoft | Identity Management Expert

Part 7 added: https://www.dhirubhai.net/pulse/part-7-consulting-challenges-why-do-most-rbac-abac-fail-sergeev-2ppzc/
Part 8 added: https://www.dhirubhai.net/pulse/part-8-iam-people-management-challenges-eugene-sergeev-w5ssc