FAQ

This collection of frequently asked questions offers an overview of the architectural and functional features of TrustedX and KeyOne.

  • Adaptive Authentication and Federation

    Electronic authentication provides guarantees on the identity of an entity wanting to interact with an IT system. Depending on the characteristics of the authentication process, the identity (of a person or an application) is established according to a given degree of certainty (level of assurance).
    An authentication factor is an element of identity proof used in authentication processes. Examples of authentication factors include passwords, private cryptographic keys, biometric parameters, etc.
    There are three types of authentication factors: those based on something that only the legitimate user knows (e.g., a password); those based on something that only the legitimate user has (e.g., an OTP generation token); and those based on a personal characteristic distinctive to the legitimate user (e.g., keystroke dynamics or a fingerprint).
    Multi-factor authentication is any authentication procedure that uses two or more authentication factors to verify an entity's identity. You can generally obtain the maximum assurances on the identity of an entity by verifying two authentication factors of different types. This is known as two-factor authentication (2FA), an example of which is a private key in a hardware token (something you have) that you enable by entering a PIN (something you know).
    Strong authentication has evolved over time. It was originally any authentication method more secure than a static password. In other words, any method capable of resisting attacks that static passwords were vulnerable to (dictionary attacks, using keyloggers, shoulder surfing, social engineering, etc.). Currently, subsequent to the publication of the NIST's Electronic Authentication Guideline (NIST SP 800-63-1) and the ISO's Entity authentication assurance framework (ISO/IEC 29115:2013), strong authentication is considered any authentication procedure that allows verifying an entity's identity with a level of assurance equal to or higher than 3 (LoA3). Lastly, although strong authentication and multi-factor authentication are not exactly the same thing, in practice, any authentication process based on two or more authentication factors of different types (i.e., something you know, something you have and something you are) is usually called strong authentication. For instance, authentication based on a static password that the user knows by memory and a dynamic password (OTP) that the user generates with a hardware device.
    Single sign-on is when users can log in once and access a number of applications without having to resubmit their credentials on changing applications. Credentials are only submitted for the first login.
    Identity federation is the capability of using user identity attributes managed by third-party identity provider systems in an IT system. For example, using identity federation, you can access online shopping accounts with your identity for a social network. This facilitates managing identity attributes and user authentication in a distributed manner. Identity federation requires a trusted relationship between organizations that strictly observe the agreed protocols for registering users, managing identities and authentication.
    There is a wide range of authentication methods to suit the security, usability and cost requirements of different IT systems and user communities. In general, the username–password combination is the most widely used authentication method in electronic applications, mainly because of its usability and ease of administration. However, this method offers very low security and is vulnerable even to unsophisticated attacks. On the other hand, some methods offer a very high level of security but are far less intuitive for users and entail higher implementation and maintenance costs. An example of such methods is private keys stored on cryptographic cards. Applications handling sensitive information, such as online banking systems, are required by legislation to use strong authentication, e.g., authentication based on multiple factors.
    SAML 2.0 is an OASIS specification that defines an XML vocabulary and a set of rules for describing, requesting and sending information on the authentication, authorization and attributes of a user. The main uses of SAML 2.0 are single sign-on and identity federation at a global level, i.e., for scenarios where the applications and identity providers are distributed across the Web. TrustedX Adaptive Authentication provides SAML 2.0 as one of the protocols that applications can use to access TrustedX's adaptive authentication functionality to obtain assurances on the identity of users.
    Under the traditional client–server authentication model, when an application needed to access a protected resource located on an external server, the application had to submit the credentials of the owner of this resource to the server. On many occasions, this entity is not the application itself; typically, it is a user of the application. This means that the application does not access the protected resource under its own name but rather using the name of the resource's owner and, therefore, by assuming the identity of this entity. This approach presents major problems, including the inability to authenticate the application accessing the resource, the fact that the owners of the resources must share their credentials with the application (which compromises security), the impossibility of restricting the access rights of the application with respect to those of the resource's owner, and the inability to selectively revoke these rights for a particular application. OAuth 2.0 is an IETF specification (RFC 6749) that approaches this problem differently. Under the OAuth 2.0 model, the client application uses credentials different from the resource owner's to access the resource. These credentials are an access token, which an authorization server trusted by the resource server provides to the application with the authorization of the resource's owner. The application obtains the access token as follows. Firstly, the application obtains authorization to access the resource. This authorization grant, which is explicitly approved by the resource's owner, who is authenticated by the server as part of the process, is issued by an authorization server. The application then provides two things to the authorization server: 1) its identity and 2) the authorization provided by the resource's owner. To provide its identity, the application authenticates with the authorization server, with which it must be registered.
To present the authorization, it submits the authorization credentials provided to it by the authorization server. When the application has successfully done both these things, the authorization server provides it with an access token that allows it to access the resource by presenting the token to the server hosting it. While OAuth 2.0 is basically an authorization protocol, it entails, as we have seen, authenticating the entity that owns the resource. TrustedX makes use of this authentication element to provide OAuth 2.0 as one of the protocols with which applications can access the adaptive authentication functionality to obtain assurances on the identity of users. So, TrustedX uses OAuth 2.0 to authenticate the users of registered applications as if they were the owners of a hypothetical resource, while omitting OAuth 2.0 aspects that are not strictly necessary to carry out this authentication, such as the requirement to obtain approval from the user being authenticated.
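    As an illustrative sketch (not Safelayer's actual API), the following shows how a registered application could assemble the kind of RFC 6749 resource owner password credentials request this model relies on; the token endpoint URL and client credentials are hypothetical.

```python
import base64
import urllib.parse

# Hypothetical values for illustration only -- not a real TrustedX endpoint.
TOKEN_ENDPOINT = "https://idp.example.com/oauth/token"
CLIENT_ID, CLIENT_SECRET = "demo-app", "demo-secret"

def build_password_grant_request(username: str, password: str):
    """Builds the pieces of an RFC 6749 section 4.3 token request: the
    client authenticates with HTTP Basic, and the resource owner's
    credentials travel in the form-encoded body."""
    creds = base64.b64encode(f"{CLIENT_ID}:{CLIENT_SECRET}".encode()).decode()
    headers = {
        "Authorization": f"Basic {creds}",
        "Content-Type": "application/x-www-form-urlencoded",
    }
    body = urllib.parse.urlencode({
        "grant_type": "password",
        "username": username,
        "password": password,
    })
    return TOKEN_ENDPOINT, headers, body
```

    A successful response would carry a JSON object with the access token that the application then presents to the resource server.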
    A one-time password (OTP) is a password that can only be used once, i.e., a password for authenticating the user for one session or transaction only before it becomes invalid. Therefore, users that use OTPs must submit a different password each time they authenticate. Compared to traditional (i.e., static) passwords, OTPs have the advantage of being resistant to eavesdropping. If a static password is discovered through eavesdropping, the attacker can from then on use the identity of the legitimate user. In contrast, an attacker gains no advantage in discovering an OTP this way, as the password becomes invalid as soon as it is used. OTPs cannot be predicted or calculated by users as occurs with static passwords, which are memorized and then recalled by users when they need to authenticate. With OTPs, users typically have a hardware or software device (token) they use to generate the OTPs they need. Furthermore, the user is often required to enter a PIN to enable the token, which provides a higher level of security. Using OTPs as an authentication method depends on the capability of the user token and the validation process to share a secret key and keep synchronized the values, derived from a given parameter (a changing factor), that each of them uses. The changing factor is either a counter that increases each time the user presses the token button (event-based) or the time given by an internal clock (time-based). Because OTPs are generated using a secret key, they are considered a something you have factor. OTP authentication works as follows: The user has a token synchronized with the validation server. To authenticate, the users use their tokens to obtain the next OTP, which they send to the validation server. The validation server performs the same calculation and compares the result with the OTP received. The authentication is successful where the two results match and fails where they do not.
The server can be configured to generate several OTPs and approve the authentication where any of them matches the OTP sent by the client. This behavior is necessary to resynchronize the user and validation server tokens in scenarios where they may become unsynchronized. There are currently two open algorithms for generating OTPs, designed by OATH (Initiative for Open Authentication) and published by the IETF as informational documents. These algorithms are HOTP (RFC 4226) and TOTP (RFC 6238). For the HOTP algorithm, the changing factor is a counter of the number of times that the user presses the token button. For the TOTP algorithm, the changing factor is the number of periods of a certain duration (e.g., 30 seconds) that have elapsed since a given instant (e.g., the beginning of the Unix epoch).
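    The two algorithms are small enough to sketch. The following minimal Python implementation follows the RFC 4226 and RFC 6238 definitions (HMAC-SHA-1 with dynamic truncation; TOTP simply derives the counter from the clock):

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226: HMAC-SHA-1 over the 8-byte counter, then dynamic truncation."""
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # low nibble of last byte picks the 4-byte window
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, period: int = 30, digits: int = 6, now=None) -> str:
    """RFC 6238: HOTP where the counter is the number of elapsed time steps."""
    timestamp = time.time() if now is None else now
    return hotp(secret, int(timestamp // period), digits)
```

    With the RFC 4226 test secret `12345678901234567890`, the first two event-based values are 755224 and 287082, which is how a validation server can sanity-check its implementation against the published test vectors.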
    RADIUS (Remote Authentication Dial In User Service) is a protocol for accessing a centralized authentication, authorization and accounting (AAA) service. This protocol is defined in RFC 2865 and RFC 2866 and is widely used by companies in the servers that provide external access to their networks (NAS servers, wireless access points, etc.). Users connect to one of these servers and provide the credentials for a given protocol. The server, which acts as a RADIUS client, then extracts the credentials submitted by the user and sends them to a RADIUS server to be validated and, optionally, to determine which network resources the user can access. TrustedX can integrate with the RADIUS servers of organizations and make use of the password-validation capability that these servers provide. In fact, accessing a RADIUS server is one of the methods used by TrustedX Adaptive Authentication to validate the OTPs submitted by users in adaptive authentication processes that require the execution of the second line of authentication. TrustedX Signature and Encryption also uses this method to validate the OTPs submitted by platform users for logging in and accessing the platform's signature and encryption services.
    Phishing attacks aim to trick users into revealing the usernames and passwords they use to access systems so that attackers can fraudulently use the users' identities. A typical guise for this type of attack consists in sending fraudulent messages to users inviting them to log in to false websites that mimic down to the last detail the appearances of the authentic websites. Users that fall for this trick and try to log in to the false website, believing it to be authentic, are unknowingly revealing their usernames and passwords to the attackers.
    Pharming attacks involve illegally diverting the traffic a user sends to a given website to another with the same appearance in an attempt to capture the user's username and password. In contrast to phishing, pharming does not require the direct collaboration of the user to direct the user to the false website. It involves manipulating the system under which domain names are translated into IP addresses by, for example, replacing the hosts file located in the user's machine or providing false information to the DNS server that the user accesses (DNS poisoning). Another pharming strategy entails manipulating a local network router so that it provides the IP address of a fraudulent DNS server to the users.
    Risk-based authentication adjusts the level of assurance required of the factors that the user submits in an authentication process so that it adequately compensates for the potential impact of a case of identity theft.
    Behavioral biometrics identifies users by how they carry out certain typical actions and tasks. The range of traits that can be observed includes typing style, how someone walks and the order in which pages are visited when browsing through an application.
    In adaptive authentication, the level of assurance for verifying a user's identity varies depending on the risk of identity theft deduced from the user's activity or the criticality of the resources to which access is requested. When there is perceived to be a higher risk of identity theft or the criticality of the resources requested is higher, greater assurances are required from the authentication process, which ultimately means requiring more and better authentication factors from users. For instance, a simple four-digit PIN can be enough for users to log in to an online banking system if they do so at a usual time and from a usual place. However, if access is requested from a new location, a higher level of risk of fraudulent access is attributed to the transaction. In this case, in addition to the PIN, the user must correctly respond to one or more questions that only the user and the system know. Lastly, if the user wants to make a transfer, as this transaction has a greater criticality than checking balances and movements, the user is prompted to enter a code (an OTP) on-screen that is sent to the user by an alternative channel (e.g., via an SMS sent to the user's mobile).
    The main benefit of adaptive authentication is that, thanks to its flexibility, the ideal balance between security, costs and usability can be obtained. Furthermore, it gets rid of unnecessary complexity in the authentication process that amounts to an obstacle for users, which further promotes the use of the Web channel and reduces the costs involved. Until now, implementing an authentication system entailed a dilemma. You either favored simplicity at the expense of security and used the most economical and easiest authentication methods but with a low level of assurance (e.g., static passwords). Or, on the other hand, you chose security over simplicity and implemented complex and expensive authentication methods that had high levels of assurance (e.g., two-factor authentication using a PIN-enabled hardware token that contained a private key or generated OTPs). In the first case, you either severely restricted the options for deploying resources and services via the Web channel or were exposed to unacceptable risks and the costs that these risks could give rise to (e.g., damaged reputation, fines and civil liability). In the second case, while theoretically the full potential of the Internet could be harnessed for providing services and resources, in practice, access via Web channels was severely underused owing to the reluctance of users to use the authentication methods required and to the logistical difficulties and high costs entailed in using these methods (e.g., the cost of support for helping users to correctly use the methods and the cost of purchasing the devices used in the methods). Adaptive authentication allows providing the optimal point between these two extremes in each situation.
When the risk of identity theft is low or the operation is not critical, both of which are determined without affecting the user experience, it is entirely acceptable to use an authentication method with a relatively low level of assurance that is economical and easy to use. On the other hand, when there is a high risk of identity theft or the user wants to perform a critical operation, using an authentication method with a high level of assurance is advisable even though it may be more complicated to use. As it tailors the security level provided to the risk level detected, adaptive authentication maximizes usability in the authentication procedures used and minimizes the total cost of implementation. Widespread use of the Web channel is in itself a cost reducer. This use, and the associated reduction in costs, has been, in many cases, impeded up until now because of the reluctance of users to use complicated authentication methods that they did not understand or that required technical knowledge beyond what can be learned through daily use. Adaptive authentication allows, in many cases, users to authenticate using very simple methods that do not entail an unacceptable reduction in security. So, deploying an adaptive authentication system can remove the reluctance that, until now, many users have had to using Web channels.
    TrustedX Adaptive Authentication is an authentication platform for Web and Cloud environments that has the following characteristics: It provides an adaptive authentication service based on validating conventional authentication factors (passwords, OTPs, etc.) and transparently observing the context in which the authentication process takes place. So, depending on the characteristics of this context, it distinguishes different risk scenarios, based on which it requires the validation of more or fewer authentication factors to accept the authentication assurances and, as a result, approve the identity of the user. It allows accessing the adaptive authentication service using the OAuth 2.0 and SAML 2.0 protocols. As a result, any Web application (service provider) that supports these protocols can use TrustedX Adaptive Authentication as an identity provider (IdP) and delegate user authentication to it. One of the special characteristics of the service is that it provides single sign-on (SSO) functionality to the applications federated with TrustedX Adaptive Authentication. It integrates with the identity repositories (databases, LDAP) and RADIUS authentication servers already deployed in the organization. It can also be federated with IdP systems already in use and, as a result, make use of their capability for validating credentials and providing identity information. Using TrustedX Adaptive Authentication does not require making any changes to the identity management and credential validation processes already being used by the organization. Users can register the devices (PC, smartphone, tablet) they use to provide their credentials and associate a customized image with each device.
Subsequently, through observing the authentication context of each access, TrustedX Adaptive Authentication detects when the registered devices are used and provides the corresponding image to the HTML page that the federated application uses to capture the user credentials. So, only when users are about to submit their credentials to TrustedX Adaptive Authentication are their customized images displayed in the browser. This constitutes an intuitive server authentication method that lets users easily distinguish between the legitimate server (customized image is displayed) and a fraudulent server in a phishing or pharming attack (customized image is not displayed) when prompted for their credentials and, as a result, avoid the attack.
    The authentication context is the elements making up the environment in which the user provides credentials for authenticating. These elements are known as context factors. By verifying these factors, the risk of identity theft to which the authentication is exposed can be assessed. For instance, the context of a username and password authentication includes as factors the device, the access location and time, and the characteristics of the keystroke dynamics with which the credentials were entered. Checks that can be done on these factors include verifying that the device, location and connection time are all usual for the user to which the credentials correspond and that the keystroke dynamics match the pattern the user had registered previously.
    TrustedX's authentication context analysis service assesses the risk of identity theft in an authentication process based on submitting credentials of a certain type (e.g., username and password) by observing the factors of the context in which these credentials are submitted. Based on the risk assessment and the level of assurance provided by the credentials, the service recommends either approving the user's identity or, if the service considers the risk of identity theft to be excessive, requesting that the user corroborate the identity given by providing an additional authentication factor, i.e., submitting another valid credential of a different type (e.g., an OTP).
    TrustedX's authentication context analysis service can currently validate (analyze) the following context factors: Registered device: Validating this factor entails checking that the device used by the user to perform the access was registered by the user as a trusted device. To register the devices, users must authenticate using a strong method (e.g., password and OTP). Registering devices as trusted devices is equivalent to users declaring that these devices are always under their control and in their possession. As a result, a lower level of risk of identity theft is attributed to authentication processes in which the user accesses using a registered device. Device history: Validating this factor entails checking, in the period defined in the configuration of the Access History, whether the user has performed one or more accesses with the same device detected in the current access. In short, a check is run to see if the user is accessing from a usual device. Keystroke dynamics: Validating this factor entails checking if the characteristics of the keystroke used to enter the username and password match the typing pattern registered in the system for this user and device during the training phase. Location history: Validating this factor entails checking, in the period defined in the configuration of the Access History, whether the user has performed one or more accesses from the same location detected in the current access. This location can be expressed as an IP address, city or country. This is a check to see whether the user is accessing from a usual location. Location change speed: Validating this factor involves calculating the travel speed of the user between the locations of the current and last accesses based on the time between these accesses. A check is then run to see if that speed is probable (i.e., that it does not exceed a threshold value). 
Device–location history: Validating this factor entails checking, in the period defined in the configuration of the Access History, whether the user has performed one or more accesses from the same location and with the same device detected in the current access. This check determines if the user is accessing from a usual location and, additionally, using a device normally used in that location.
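    The location change speed factor can be sketched as follows; this is an illustrative reconstruction, and the threshold value and coordinate source are assumptions, not actual TrustedX configuration:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two (lat, lon) points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(a))  # 6371 km = mean Earth radius

def travel_is_plausible(prev, curr, max_speed_kmh=900.0):
    """prev and curr are (lat, lon, unix_seconds) tuples for the last and
    current accesses. Returns True if the implied travel speed stays under
    the threshold (900 km/h, roughly airliner speed, is an assumed value)."""
    distance = haversine_km(prev[0], prev[1], curr[0], curr[1])
    hours = max((curr[2] - prev[2]) / 3600.0, 1e-9)  # avoid division by zero
    return distance / hours <= max_speed_kmh
```

    For instance, a login from Barcelona followed one hour later by a login from Madrid (roughly 500 km away) implies a plausible speed, while the same pair of logins ten minutes apart does not.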
    The adaptive authentication performed by TrustedX Adaptive Authentication works as follows: The user normally authenticates by username and password. This authentication is known as the primary authentication or the first line of authentication. It takes place according to an authentication policy. If the username or password is invalid, TrustedX concludes that the user is not the legitimate user and ends the adaptive authentication process. The user's authentication context is analyzed. In other words, the environment in which the username and password are provided by the user is analyzed. Checks are run, for example, to determine if the user is using a device registered as a trusted device, if the user is accessing from a usual location, etc. This analysis is performed according to a context analysis policy. If the context analysis determines that there is sufficient evidence in the environment for trusting that the user really is legitimate, TrustedX concludes that the user is, effectively, authentic and ends the adaptive authentication process. If, however, the context analysis determines that the user's authenticity is not clear cut, TrustedX concludes that an additional line of authentication should be requested from the user. The user authenticates, for example, using a one-time password (OTP). This authentication is known as the secondary authentication or the second line of authentication. It takes place according to an authentication policy. If the one-time password is valid, TrustedX concludes that the user is authentic. If the password is invalid, it determines that the user is not authentic.
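    The decision flow just described can be sketched in a few lines; the three callables stand in for the policy-driven validators, and the risk threshold is an illustrative value, not an actual TrustedX policy setting:

```python
def adaptive_login(username, password, context,
                   validate_password, assess_context_risk, request_and_validate_otp,
                   risk_threshold=0.5):
    """Illustrative sketch of the two-line adaptive flow:
    1) primary authentication, 2) context analysis, 3) optional step-up."""
    # First line of authentication: username and password.
    if not validate_password(username, password):
        return "rejected"
    # Context analysis: registered device, usual location, keystroke dynamics...
    if assess_context_risk(username, context) <= risk_threshold:
        return "authenticated"  # enough contextual evidence, no second line needed
    # Second line of authentication: one-time password.
    return "authenticated" if request_and_validate_otp(username) else "rejected"
```

    A failed password rejects the user outright, a low-risk context accepts the primary authentication alone, and a high-risk context triggers the OTP step-up.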
    Users can select a personal image that is displayed in the form for entering credentials alongside the username and password fields. Phishing and pharming protection derives from the fact that the customized image is different for each device (i.e., for each browser in an operating system account). So, even if a false website can exactly replicate the legitimate credential input form, it cannot replicate the customized image the user expects to see as this image is different for each user (and, in general, for each device each user uses).
  • Electronic Signature and Encryption

    • TrustedX basics

      TrustedX is a Web services platform that provides mechanisms for handling authenticity, electronic signatures and data encryption. TrustedX is based entirely on policies and, once configured, it manages corporate trust automatically. TrustedX is used by corporate applications to provide security and optimize business processes, automating transactions by electronic means and without the need for documents in paper form. Examples of applications are e-Procurement, e-Contracting, e-Invoicing, e-Banking, e-Government, e-Learning, etc. The platform resolves the complexity involved until now in endowing applications with security mechanisms using classic integration tools. The platform's Service-Oriented Architecture (SOA) and complete information management system simplify the integration of trusted mechanisms in business processes, making them independent of one another and offering the capacity to centrally manage security policies and auditing. TrustedX is a product of Safelayer.
      Toolkits offer a series of functions for building functional logic embedded in the actual business processes. This must be done for each of the processes that require the functions offered by a specific toolkit. Any minor change to the logic may require that the code be rewritten or, to avoid this, complex configuration methods may be needed. TrustedX is based on common architectural principles, Web Services (WS) and Service-Oriented Architecture (SOA), to offer a homogeneous series of functions as a service ("sign a document", "verify the signature of a document", "encipher/decipher a document", "authenticate and authorise", etc.), so the complexity required by the business logic moves outside the applications. This simple idea has huge advantages: There is no need to recompile if new standards appear. All that is needed is to incorporate new components into TrustedX for the business processes to be able to use them immediately. The business logic can change at any time without having to change a single line of code or recompile.
      TrustedX can be deployed in an SOA architecture in the following ways: Web services: the business process uses TrustedX services to protect or process data (request/response), obtaining an XML document with the results. XML gateway: the business process uses TrustedX as an XML security gateway so that it performs the functions of protecting sent data (request/sending) or processing received data (processing/reception). The TrustedX integration gateway chains a successive series of XML processing rules (for instance, signing, enciphering, etc.), which are executed to achieve the required data output. This architecture has the advantage of allowing security mechanisms to be implemented without modifying the web services. A combination of the two.
      TrustedX is a web services and SOA architecture platform that provides the following access interfaces: SOAP/WS: As a web service, for example using AXIS or .NET tools and/or manipulating the requests and responses using XPath and XSLT. Support for the OASIS DSS standard is one of its outstanding features. REST/WS, SOAP/WS: Using the integration gateway that enables XML traffic to be processed, delegating to TrustedX the usual data processing capacity (to transform, sign, verify, encipher, decipher, authenticate, authorise, etc.), which is executed in a pipeline to achieve the required data output. TrustedX API: Through an integration API so that applications can use the TrustedX services transparently, using both the Web Services standards and the OASIS DSS standard.
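      To give an idea of what an OASIS DSS exchange looks like, the following builds a minimal dss:SignRequest envelope; the request ID and payload are illustrative, and a real deployment would add profile- and policy-specific optional inputs:

```python
import base64
from xml.etree import ElementTree as ET

# OASIS DSS 1.0 core schema namespace.
DSS_NS = "urn:oasis:names:tc:dss:1.0:core:schema"

def build_sign_request(payload: bytes, request_id: str = "req-1") -> str:
    """Minimal OASIS DSS SignRequest carrying one base64-encoded input document."""
    ET.register_namespace("dss", DSS_NS)
    req = ET.Element(f"{{{DSS_NS}}}SignRequest", {"RequestID": request_id})
    docs = ET.SubElement(req, f"{{{DSS_NS}}}InputDocuments")
    doc = ET.SubElement(docs, f"{{{DSS_NS}}}Document")
    data = ET.SubElement(doc, f"{{{DSS_NS}}}Base64Data", {"MimeType": "text/plain"})
    data.text = base64.b64encode(payload).decode()
    return ET.tostring(req, encoding="unicode")
```

      The corresponding dss:SignResponse would carry the signature (or an error result) back to the caller; the same pattern applies to verification requests.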
      TrustedX supports the main standards for protecting documents and web services based on a PKI infrastructure. Standards related to the protection of electronic documents: OASIS DSS, IETF PKCS#7/CMS, ETSI CAdES, PDF Signature, W3C XML-DSig, ETSI XAdES, W3C XML-Enc and IETF S/MIME. Standards related to the protection of web services: OASIS WS-Security, SSL/TLS and OASIS SAML/XACML; future versions will support OASIS WS-Trust and WS-Policy. Public key infrastructure (PKI) standards: ITU-T X.509v3, IETF OCSP and IETF TSP. Service infrastructure standards: SQL, HTTP, WebDAV, LDAP and PKCS#11.
      TrustedX provides a centralized security and management solution, eliminating integration costs and avoiding the complexity inherent in programming models based on "silos". The silo integration strategy focuses on the specific requirements of each application in isolation from the rest. While this approach can address initial needs, the lack of a common strategy leads to divergent configuration, management and maintenance characteristics. The result is that management costs increase and security decreases as the number of applications grows. The TrustedX platform, in contrast, provides centralized, comprehensive security and trust functions to applications, making a Service-Oriented Integration strategy possible. This approach addresses initial needs without increasing costs as the number of applications grows, thanks to the elimination of redundant resources. This is in line with current trends in corporate information systems, which are marking the end of the predominance of rigid software architectures.
      A trusted services provider is more than a system that centralises security protocols and functionality. One of its fundamental contributions is that it enables the uniform management of the trust domain, that is, the application of security policies in an ecosystem. In turn, it also applies the agreements required with other ecosystems to achieve the development of trust in a federated form. The trusted services provider has the following features: The ability to establish a uniform diagnosis of the information's level of trust. This diagnosis is performed using the security data related to the operation (chains of digital certificates, revocation lists, time-stamps, etc.), taking into account the trust offered by the TTPs (CA, VA, TSA, etc.) that generated them. Ease of integration and interoperability of electronic signatures and electronic encryption envelopes, encapsulating under a series of uniform services and a common interface (i) all standard formats (PKCS#7/CMS, CAdES, S/MIME, PDF-Signature, XML-DSig, XAdES, XML-Enc and WS-Security) and (ii) the complexity of the processing logic. Delegation of the configuration, maintenance and management of the security parameters to a centralised, policy-based system, freeing consumers (users, applications and other web services) from its complexity and maintenance. A centralised log and auditing system, and even a storage system that guarantees the management of cryptographic material over long periods of time.
      TrustedX has been conceived, designed and implemented as a business component (service) within an SOA architecture. Thanks to this approach, any business process can take advantage of the security and trust functionality provided by TrustedX, which can be used either as a service provider (request/response) or as a gateway (reception/resending). The following figure shows TrustedX in an SOA architecture:
      TrustedX is a trusted services provider; that is, it determines the appropriate level of trust (using a four-level metric: low, medium, high and very high) of authentication mechanisms, signed documents and encrypted data, in line with the corporate security policies, and reports it to the business processes as a basis for making decisions. For example, a VeriSign certificate is not granted the same level of trust as an e-DNI (Spanish e-ID card), or as a user who has been authenticated with a login and password. Thus, in the case of the electronic signature, in addition to reporting its validity, TrustedX can assess its degree of trustworthiness in a simple way, freeing the business processes from having to implement this type of logic. The semantic capabilities associated with TrustedX trust therefore greatly simplify the integration effort of applications in the corporate environment while making the trust levels established in corporate policies understandable to applications.
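      The four-level metric can be pictured as a simple comparison against a policy threshold. This is a hypothetical sketch: the mechanism-to-level mapping is an invented example policy, not a TrustedX default, and the function names are not a real TrustedX API.

```python
# Invented example of a four-level trust metric (low/medium/high/very high).
LEVELS = {"low": 1, "medium": 2, "high": 3, "very high": 4}

MECHANISM_TRUST = {               # illustrative policy, not a shipped default
    "login/password": "low",
    "otp": "medium",
    "software certificate": "high",
    "e-dni smart card": "very high",  # e.g., the Spanish e-ID card
}

def meets_policy(mechanism: str, required_level: str) -> bool:
    """True if the mechanism's trust level satisfies the policy threshold."""
    return LEVELS[MECHANISM_TRUST[mechanism]] >= LEVELS[required_level]

print(meets_policy("e-dni smart card", "high"))   # True
print(meets_policy("login/password", "medium"))   # False
```

      A business process then only compares levels; it never has to know how each mechanism works internally.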
      TrustedX is currently the most modern and sophisticated trust manager based on PKI digital certificates. Its powerful use of digital certificate validation policies enables the configuration of any model of trust based on Certification Authorities (CAs), Validation Authorities (VAs), Time-Stamping Authorities (TSAs) and Assertion Authorities (IdPs). With multi-CA support, TrustedX can be configured to trust any number of CAs and VAs, following any structure, whether hierarchical or cross-certified. It supports standard validation mechanisms based on CRLs and OCSP, as well as customised mechanisms (for example, a query to an information service), and it establishes different trust levels for each of the CAs, VAs or TSAs. It also supports the development of trust through the federation of identity providers (IdPs).
      Digital authentication univocally guarantees an entity's identity and attributes (who it is and what it is). While the identity provides the name of a person or machine, the attributes offer information such as the capacity to practice as a qualified professional, credit limits, date of birth, etc. Trusting the identity of participants (customers, citizens, employees, professionals and organizations) is fundamental to any relation, especially considering the increase in relations conducted by electronic means. Traditional password-based mechanisms are now considered insufficiently trustworthy, which makes the use of secure authentication systems crucial for staying competitive. Safelayer technology is based on PKI and digital certificates, the most trustworthy form of multi-factor authentication, and provides the utmost security for business assets. This technology has been universally adopted by standards bodies and industry, which guarantees investment protection. Through protocols such as SSL/TLS, IPsec, MSSP (Mobile Signature Service Provider) and WS-Security, mutual authentication is strongly guaranteed in applications such as remote access control for employees, secure messaging services, data interchange, and authentication in corporate, financial and government Web portals. Moreover, the electronic signature can be used to guarantee non-repudiation.
      Data integrity is the service offered by a Public Key Infrastructure that detects any changes that may have taken place, accidentally or intentionally, while data is stored or transmitted over the Internet. Authentication and integrity services are the basis of electronic signatures, which can be compared with hand-written signatures, thus removing the need for paper. The electronic signature resolves the problem of non-repudiation in electronic transactions and documents. Above all, the e-signature is a key instrument in optimizing business processes and reducing costs, which leads to more satisfied customers and users. Safelayer's TrustedX electronic signature service supports most signature formats for electronic documents, emails and web messaging. Supported features include multiple signatures, signatures with time-stamps and long-term signatures (for validating a signature past the expiry date of the digital certificates).
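      The integrity property can be illustrated with a hash comparison. A real electronic signature also binds the hash to the signer's private key and certificate, which this stdlib-only sketch deliberately omits:

```python
import hashlib

# Digest of the document as it existed when it was signed.
document = b"Pay 100 EUR to account 1234"
signed_digest = hashlib.sha256(document).hexdigest()

# Any accidental or intentional change yields a different digest,
# so verification fails and the modification is detected.
tampered = b"Pay 900 EUR to account 1234"
print(hashlib.sha256(tampered).hexdigest() == signed_digest)  # False
```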
      The confidentiality service offered by a Public Key Infrastructure enables electronic data (files and communications) to be protected and controls access to the data by applying PKI-based authentication mechanisms. Public key technology has proved particularly appropriate for managing the risks of interchanging information over telematic networks and for protecting documents, through the provision of identification and trust features. PKI lays the foundations for protecting sensitive data through data encryption and for managing access control to that data with digital certificates. The level of data protection depends on the strength of the cryptographic algorithms and keys used for encryption. TrustedX centrally and continually determines the required cryptographic parameters based on encryption and decryption policies defined according to factors such as the type of information, roles and applications. The TrustedX key management service guarantees the secure administration of user/application keystores (on disk or in an HSM). This administration includes generating and importing keys, generating certification requests and importing certificates.
    • Data and document protection with TrustedX

      TrustedX provides electronic signature services that are accessible through the protocols defined in the OASIS Digital Signature Services (DSS) standard. These protocols are profiled so that the following functions are provided in different scenarios (CMS/PKCS#7, S/MIME, XML-DSig, WS-Security, PDF, etc.):
      - Generation of electronic signatures
      - Verification of electronic signatures
      - Updating signatures with evidence for non-repudiation
      Additionally, TrustedX incorporates a series of data protection services that are accessible through protocols similar to those in DSS and that are profiled for providing the following functions in different scenarios (CMS/PKCS#7, S/MIME, XML-Enc, WS-Security, etc.):
      - Document encryption
      - Document decryption
      - Symmetric key management
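      A DSS request is an XML envelope. The sketch below builds a minimal SignRequest with the Python stdlib; the element names follow the OASIS DSS core schema, but the Profile URI is an invented placeholder and real deployments add profile-specific options:

```python
import base64
import xml.etree.ElementTree as ET

DSS = "urn:oasis:names:tc:dss:1.0:core:schema"
ET.register_namespace("dss", DSS)

# Minimal SignRequest: one base64-encoded input document.
# The Profile value below is a made-up example, not a registered URI.
req = ET.Element(f"{{{DSS}}}SignRequest", {"Profile": "urn:example:cades"})
inputs = ET.SubElement(req, f"{{{DSS}}}InputDocuments")
doc = ET.SubElement(inputs, f"{{{DSS}}}Document")
data = ET.SubElement(doc, f"{{{DSS}}}Base64Data")
data.text = base64.b64encode(b"contract body").decode()

xml_bytes = ET.tostring(req)
print(xml_bytes.decode())
```

      The response (a SignResponse) would carry the generated signature object back to the caller in the same envelope style.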
      The formats supported by TrustedX in relation to document protection are:
      - Generic XML documents. Supports XML-DSig, XML-Encryption and XAdES (the advanced digital signature format standardised by W3C and ETSI). Allows enveloping, embedded or detached signatures, including signatures by reference of any node of an XML document.
      - Documents with generic format. Supports PKCS#7/CMS and CAdES (the advanced digital signature format standardised by ETSI). The S/MIME format used to protect e-mails, also supported by TrustedX, is an example of PKCS#7/CMS use. Allows simple and multiple digital signatures (sequential or parallel), in enveloping or detached digital signature format.
      - PDF documents. Supports the digital signature format that is natively embedded in PDF documents (PDF digital signature).
      TrustedX supports the advanced electronic signature formats defined in ETSI CAdES/XAdES. To summarise:
      - Basic electronic signature (BES) and explicit policy electronic signature (EPES), which include the electronic signature and other basic information provided by the signer (for example, signing time, electronic signature policy, etc.). This is the form of electronic signature generated by the author.
      - Electronic signature with time-stamp (ES-T), which adds to the previous one a time-stamp issued by a TSA. The time-stamp is added when the electronic signature is verified, although it can also be supplied by the author of the electronic signature.
      - Electronic signature with complete validation information (ES-C), which adds to the previous one the set of references that complete the electronic signature verification (chain of certificates and information on the certificates' status). The references are incorporated into the electronic signature after a sufficient amount of time, or grace period, has elapsed.
      - File electronic signature (ES-A), which includes in the time-stamped electronic signature all the information required for its verification (when it includes only this first time-stamp, it is called an extended electronic signature). Over time, the electronic signature is renewed periodically using additional time-stamps. TrustedX performs these updates transparently, thus guaranteeing the longevity of stored electronic signatures.
      Usually, the loss of evidential information is related to the passing of time (evidence that was once valid ceases to be so at a certain time) or to the mobility of the signed document (in some implementations, the evidence is only valid in the context of the original file system). The solution to this problem was specified in 2001 in IETF RFC 3126, later also adopted in the XAdES and CAdES standards of ETSI (European Telecommunications Standards Institute), both supported by TrustedX. These recommendations are based on securing the electronic evidence (certificates, CRLs or OCSP responses) with time-stamps. The implementation of these functions in TrustedX is associated with the electronic document custody and filing service, which transparently guarantees non-repudiation over time. The service, based on LTANS (Long-Term Archive and Notary Services), offers functions for document registration (sending documents for custody), status verification (checking the status of one or more documents), document extraction (obtaining signed documents so that, for example, they can be verified by third parties), erasure (elimination of documents) and verification (verifying the documents' signatures).
      A time-stamp endorses the existence of data before a specific date and is essential for non-repudiation. The time-stamp of an electronic signature (ETSI ES-T) proves that the signature existed before a certain date and that it was generated during the digital certificates' validity period (for example, that it was not created after the certificate's revocation date). Furthermore, to preserve an electronic signature over long periods of time, beyond the digital certificate's validity period, proof of the existence of certificate status data (CRLs or OCSP responses) at the time of signing is required. To this end, the electronic evidence is filed together with the time-stamped electronic signatures, as specified in the ETSI ES-A file electronic signature format. TrustedX supports the electronic signature profiles defined by the ETSI XAdES and CAdES standards: basic electronic signature (BES), explicit policy electronic signature (EPES), electronic signature with time-stamp (ES-T), complete electronic signature (ES-C) and file electronic signature (ES-A). The time-stamp format supported by TrustedX is IETF TSP (RFC 3161, Internet X.509 Public Key Infrastructure Time-Stamp Protocol).
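      The core check a time-stamp enables can be reduced to date arithmetic: the attested signing time must fall inside the certificate's validity period and precede any revocation. The dates below are invented for illustration:

```python
from datetime import datetime

# Certificate validity window and revocation date (invented examples).
not_before = datetime(2023, 1, 1)
not_after = datetime(2025, 1, 1)
revoked_at = datetime(2024, 6, 1)    # None if the certificate was never revoked

# Signing time attested by the TSA in the RFC 3161 token.
timestamp = datetime(2024, 3, 15)

valid = not_before <= timestamp <= not_after and (
    revoked_at is None or timestamp < revoked_at
)
print(valid)  # True: signed within validity and before revocation
```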
    • Web Services protection with TrustedX

      TrustedX supports SOAP (Simple Object Access Protocol) web services and REST (Representational State Transfer) style services, supporting the OASIS WS-* standards, to which it provides integrity, confidentiality and authentication mechanisms at SOAP-message level, as specified by WS-Security. TrustedX also secures REST-style services based on XML (without SOAP) and the HTTP protocol, thanks to its support for the XML-DSig and XML-Enc standards. In addition to offering its services in client-server mode, TrustedX can also transparently protect corporate services using an integration gateway that is included in the system. One of the outstanding features of the TrustedX integration gateway is that it enables messages to be processed using an XML pipeline language, so it is possible, for example, to transform a REST-style service to SOAP and secure it by applying WS-Security, or vice versa.
      At SOAP messaging level, TrustedX supports the WS-Security authentication mechanisms. These mechanisms can be based on PKI and digital certificates (X.509 security tokens), on username tokens, or on SAML (Security Assertion Markup Language) assertions. At transport level, TrustedX supports mechanisms based on PKI (SSL/TLS client certificates) and on username and password (HTTP basic authentication, possibly over SSL/TLS). TrustedX can delegate the authentication process to external systems and can be integrated into a corporate single sign-on (SSO) access control system. In this case, authentication is delegated to an external agent that is in charge of implementing the authentication mechanisms (for instance, Kerberos, OTP or biometrics) and delivering the credentials (authoritative authentication) to TrustedX.
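      As an illustration, a WS-Security header carrying a UsernameToken (one of the mechanisms mentioned) can be assembled with the stdlib. The namespace URI is the standard WSS 1.0 one; the credentials are placeholders:

```python
import xml.etree.ElementTree as ET

WSSE = ("http://docs.oasis-open.org/wss/2004/01/"
        "oasis-200401-wss-wssecurity-secext-1.0.xsd")
ET.register_namespace("wsse", WSSE)

# Build <wsse:Security> with a UsernameToken (placeholder credentials).
security = ET.Element(f"{{{WSSE}}}Security")
token = ET.SubElement(security, f"{{{WSSE}}}UsernameToken")
ET.SubElement(token, f"{{{WSSE}}}Username").text = "alice"
ET.SubElement(token, f"{{{WSSE}}}Password").text = "secret"  # placeholder

print(ET.tostring(security).decode())
```

      In a real exchange this header would sit inside the SOAP Envelope's Header element, and a digest-type password or an X.509 token would normally replace the plain-text password.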
      TrustedX incorporates an authorization system based on an RBAC (Role-Based Access Control) monitor, which issues an authorization decision of (i) allowed, (ii) denied or (iii) indeterminate, according to the role given to an application or user identity and the action requested on a certain resource. The authorization system controls access to the trusted services offered by TrustedX (as a trusted services provider), but it can also be used for managing authorization to services outside TrustedX. In this sense, TrustedX offers the following features:
      - A standard interface for accessing authorization services: an Authorization Decision Query/Response of the SAMLP protocol or a Context Request/Response of XACML.
      - Behaviour as an authorization policy decision point (PDP), in which the authorization rules are assessed and TrustedX issues an authorization decision through the RBAC monitor.
      - Behaviour as an authorization policy enforcement point (PEP), in which TrustedX delegates the authorization decision to an external component accessed through SAMLP or XACML.
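      A toy sketch of an RBAC monitor with the three decision outcomes follows; the policy tables, role names and function names are invented, not TrustedX's configuration format:

```python
# Invented example policy: role -> set of permitted (action, resource) pairs.
PERMISSIONS = {
    "signer":  {("sign", "invoice"), ("verify", "invoice")},
    "auditor": {("verify", "invoice")},
}
DENIALS = {"auditor": {("sign", "invoice")}}  # explicit denials win

def decide(role: str, action: str, resource: str) -> str:
    """Return 'allowed', 'denied' or 'indeterminate' for the request."""
    if (action, resource) in DENIALS.get(role, set()):
        return "denied"
    if (action, resource) in PERMISSIONS.get(role, set()):
        return "allowed"
    return "indeterminate"  # no applicable rule; a PEP may delegate this case

print(decide("signer", "sign", "invoice"))   # allowed
print(decide("auditor", "sign", "invoice"))  # denied
print(decide("auditor", "sign", "report"))   # indeterminate
```

      The "indeterminate" branch is what makes PEP behaviour possible: when no local rule applies, the decision can be forwarded to an external PDP via SAMLP or XACML.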
      The inclusion of security methods in Web services by means of TrustedX involves the consumption of specialized services. Given that the entire logic of the security methods is delegated to TrustedX, the implementation of the WS-Security, XML-DSig and XML-Enc standards for securing Web messages is carried out from an SOA point of view. The possible implementation architectures are:
      - Integration gateway: TrustedX is used as a security gateway, in that it is responsible for protecting the messages sent and for processing the messages received. This architecture has the advantage of implementing the security methods without having to modify the Web services.
      - Security Web services: the application uses TrustedX to request the protection or processing of messages. In this architecture, unlike the previous one, the TrustedX results are returned to the application.
      - A combination of both.
      The figure below illustrates the two architectures. TrustedX's gateway functions include the capacity to receive SOAP/WS or REST/WS data, process it by means of an XML pipeline language and, finally, re-send it. The gateway's pipeline capabilities make it possible to chain a successive set of XML processing steps (transform, sign, verify, encrypt, decrypt, authenticate, authorize, access external information sources, etc.) that are executed to obtain the desired data output. TrustedX is also responsible for the following aspects, which are key to managing trust:
      - Private key management: TrustedX provides protection methods to prevent key copying or unauthorized use, and renders key and digital certificate renewal transparent to the applications.
      - Centralized security policy and trust management: enables the centralized management of the trusted entities (CAs, TSAs and VAs) and the definition of the requirements for validating digital certificates and verifying signatures.
      - Facilitation of the auditing of the system and of the security methods in the corporate SOA architecture.
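      The pipeline idea can be mimicked in a few lines: a message passes through an ordered chain of stages. The real gateway expresses this in an XML pipeline language; these Python functions and stage names only model the chaining:

```python
# Each stage takes a message dict and returns an enriched copy (invented stages).
def authenticate(msg): return {**msg, "authenticated": True}
def to_soap(msg):      return {**msg, "format": "SOAP"}
def sign(msg):         return {**msg, "signed": True}

PIPELINE = [authenticate, to_soap, sign]  # the order of stages matters

def run(pipeline, message):
    """Feed the message through every stage in sequence."""
    for stage in pipeline:
        message = stage(message)
    return message

out = run(PIPELINE, {"body": "<order/>", "format": "REST"})
print(out["format"], out["authenticated"], out["signed"])  # SOAP True True
```

      Swapping or reordering the stage list is what lets the gateway turn a REST message into a secured SOAP one, or the reverse, without touching the protected service.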
    • Virtual Smart Card

      The virtual smart card is an application that allows Windows users to perform cryptographic operations with the keys they have in TrustedX in the same way as they would with keys on cryptographic cards. So, it's an application via which Windows applications can see TrustedX as a cryptographic card. To perform this function, the virtual smart card includes an implementation of the PKCS #11 interface and a Windows CSP.
      The main benefit of the virtual smart card is that it allows centrally managing the personal certificates and keys that users use in their desktop applications. This enables effectively controlling these keys and certificates with corporate-wide policies and providing an auditable trace of their use in a central event log. All of which is done transparently for the applications. Centralization means being able to assure that an adequate protection level is applied to the user keys: The keys can be physically stored in an HSM. Strong authentication (based on two factors) can be made a requirement for users to access their keys. You can exercise precise control over when and how the keys are used. For example, you can make the keys accessible only in a defined time range. Another benefit of the virtual smart card is that it allows multiple users to share and use the same keys in their own machines, without each user needing to have a copy of the keys. This allows conveniently and easily managing the keys and certificates in scenarios in which multiple users need to use corporate keys.  
      The main uses of the virtual smart card are the following: Signing Word, Excel, PowerPoint and PDF documents using the corresponding editing tools (Office, Adobe Acrobat, Acrobat Reader). Signing Web forms in browsers (IE, Chrome, Firefox) using a signature applet. Signing and decrypting emails (Outlook, Thunderbird). Client authentication for accessing secure Web servers using TLS/SSL with mutual authentication (IE, Chrome, Firefox). Generating keys and certificate requests with browsers (IE, Chrome, Firefox). Integration with Windows mechanisms for manual enrollment and auto-enrollment.  
      The Windows applications access the virtual smart card in the following ways: Directly, via the MS-CAPI interface. For example, Word, Excel, PowerPoint, Acrobat, Outlook, Internet Explorer and Chrome. Directly, via the PKCS #11 interface. For example, with the pkcs11-tool from the OpenSC project. Indirectly, via the PKCS #11 interface using Mozilla's NSS library. For example, Firefox and Thunderbird. Indirectly, via the PKCS #11 and MS-CAPI interfaces using JCA/JCE and the required cryptographic providers (e.g., SunPKCS11, SunMSCAPI). For example, the OpenSignX signature applet.
    • Watched folders

      Watched folders are a file-exchange based mechanism for accessing TrustedX services. Client applications copy files to input folders checked regularly by TrustedX. TrustedX processes the files it detects and creates a second, output file that is copied to an output folder. Files are processed according to which folder TrustedX finds them in. The client application collects the file returned by TrustedX from the output folder. The files are searched for in the input folders and processed by background tasks launched by the system's Scheduler. The characteristics of these tasks (e.g., their frequency) can be configured.
      Watched folders provide a mechanism for accessing the TrustedX services that can be easily integrated into applications at a low cost. When using this mechanism in Windows Explorer, you can perform tasks such as signing documents just by copying and pasting (or dragging and dropping) them to the input folder. Applications that run unattended can usually process input and output data by reading and writing files. By making use of this capability along with the watched folders, integrating the e-signing of data in these applications is extremely straightforward.
      Watched folders are usually external to the appliance and the systems in which client applications run. So, both the applications and TrustedX access the watched folders using NFS or SMB/CIFS.
      TrustedX's processing of a file pasted to the input folder of a watched folder entails executing a pipeline through which the file passes to obtain the resulting file that is copied to the output folder. Which pipeline is executed for processing a file is determined by the SmartGateway policy associated to the watched folder, i.e., to the pair of input and output folders involved.
      The pipelines executed to process the input files are fully configurable because they are defined using an extended version of the Smallx language. As a result, the processing (i.e., the actions) performed on the input files is not predefined and can be programmed. So, the watched folders can be used to perform any type of processing on the input files.
      A watched folder is not just a pair of input and output folders. It has a more complex structure that includes the following folders:
      - Base folder: contains the other folders and is, strictly speaking, the watched folder, as it is the folder associated to a given SmartGateway policy.
      - Temporary or pre-input folder: the client application copies the input files here before they are moved to the input folder. Since moving a file between the two folders is an atomic operation, this avoids the simultaneous writing and reading of input files by the client applications and TrustedX respectively.
      - Input folder: periodically checked by TrustedX, which collects the files it finds.
      - Working folder: the input files are moved to this folder once TrustedX has scheduled their processing. TrustedX also copies to this folder the files obtained as a result of processing the input files, prior to moving them to the output folder. This avoids the simultaneous writing and reading of output files by TrustedX and the client applications respectively.
      - Output folder: TrustedX copies to this folder the files obtained as a result of processing the input files. These files are moved from the working folder, where they are first created.
      - Error folder: contains files that describe the errors that occur during the processing of input files.
      - Processed-file folder: successfully processed input files are copied to this folder.
      - Files-processed-with-error folder: input files that could not be successfully processed are copied to this folder.
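      The reason for the pre-input folder can be shown with the stdlib: writing a file may take several operations, but the final move into the input folder is a single atomic rename (within one filesystem), so the watcher never sees a half-written file. The folder names follow the structure described above; the file content is a placeholder:

```python
import os
import tempfile

# Lay out an illustrative watched-folder structure.
base = tempfile.mkdtemp()
for name in ("pre-input", "input", "working", "output", "error"):
    os.mkdir(os.path.join(base, name))

# Step 1: the client writes the file into the pre-input folder
# (this may take several write calls and is not atomic).
staged = os.path.join(base, "pre-input", "contract.pdf")
with open(staged, "wb") as f:
    f.write(b"placeholder document content")

# Step 2: a single atomic rename publishes the file to the input folder,
# so the watcher either sees the whole file or nothing.
os.rename(staged, os.path.join(base, "input", "contract.pdf"))
print(os.listdir(os.path.join(base, "input")))  # ['contract.pdf']
```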
    • eSignatures in Document Management Systems

      Alfresco Signature Workflows is a TrustedX signature plug-in for Alfresco. With Alfresco Signature Workflows installed, Alfresco users can generate and verify signatures in TrustedX using the Alfresco Share graphical interface. To make this possible, the plug-in adds the Sign and Verify commands to the Alfresco Share menu that lists the actions that can be performed on content stored in Alfresco. Alfresco Signature Workflows also supports starting workflows for generating n parallel or serial signatures for content stored in Alfresco.
  • Public Key Infrastructure

    KeyOne is Safelayer's product family for implementing all the services of a PKI (public key infrastructure):
    - Digital certificate management (request registration, renewal, revocation).
    - Digital certificate validation (by CRL download or OCSP requests).
    - Electronic time-stamping (for generating evidence that proves the existence of specific data at a certain time).
    This functionality complies with the main market standards, such as:
    - The ITU-T X.509v3 standard for managing the digital certificate life-cycle.
    - The IETF standards for validating certificates using the OCSP protocol and generating time-stamps as per the TSP protocol.
    Public key infrastructures (PKI) provide the services for establishing trusted electronic relations. In this context, the Trusted Third Parties (TTPs):
    - Guarantee the univocal relation between the entities and the socio-economic data attributed to them.
    - Univocally associate a given date with specific data.
    - Provide proof that established relations are still valid.
    There are three types of trusted entities: Certification Authorities for issuing and managing digital certificates, Validation Authorities for guaranteeing the validity of digital certificates, and Time-Stamp Authorities for guaranteeing the existence of data at a given time. The KeyOne application family provides all the components for implementing these entities.
    Certification Authority. In a PKI, two authorities play a role in issuing certificates: the Certification Authority (CA), which is exclusively devoted to issuing certificates and Certificate Revocation Lists (CRLs), and the Registration Authority (RA), to which the CA delegates the tasks of receiving requests (for digital certificate issue, renewal and revocation) and approving or rejecting them. While the RA component interacts with the entities requesting certificates (e.g., corporate users) and with the company's decision-making system (for obtaining requester data and attributes), the CA is left solely to process approved requests and generate certificates.
    Validation Authority. Business processes that involve accepting digitally-signed data (e.g., electronic transaction orders) require the prior validation of the electronic signature certificates used. The validation service of the Validation Authority (VA) checks the validity of a digital certificate at a given time; i.e., it determines whether the digital certificate is valid or revoked. The KeyOne VA online service thus provides a much more efficient way of validating certificates than the traditional model of downloading the CRLs issued by the CA.
    Time-Stamp Authority. The Time-Stamp Authority (TSA) issues digitally-signed time-stamps to prove the existence of data at a given time. In the case of digital signatures, time-stamps help guarantee the existence of the electronic signature and the signed data at a given moment in time. The use of time-stamps is critical in environments such as these: public administration services that need to guarantee delivery dates, electronic notary services, and the submission of offers with a limited validity period.
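      The status decision a Validation Authority returns can be modelled as a lookup. Real responses are DER-encoded and digitally signed per the OCSP standard, which this sketch omits; the serial numbers are invented:

```python
# Invented certificate serials known to the responder and those revoked
# according to the CA's revocation information.
REVOKED = {"0x1A2B", "0x99FF"}
KNOWN   = {"0x1A2B", "0x99FF", "0x0042", "0x0043"}

def cert_status(serial: str) -> str:
    """Return the OCSP-style status of a certificate serial number."""
    if serial not in KNOWN:
        return "unknown"
    return "revoked" if serial in REVOKED else "valid"

print(cert_status("0x0042"))  # valid
print(cert_status("0x1A2B"))  # revoked
print(cert_status("0xDEAD"))  # unknown
```

      The efficiency gain over CRL download is that the client asks about one serial and receives one signed answer, instead of fetching and parsing the CA's entire revocation list.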
    KeyOne products support face-to-face, remote and automatic registration.
    Face-to-face registration. The requester obtains the certificates in one step (after making the request). For example, the KeyOne LXRA client application for registration operators supports deploying a face-to-face system that is close to the requester and centrally managed by KeyOne XRA. This model supports integrating cryptographic mechanisms such as card printers, meaning the keys and their certificates can be given to the owner on a smart card.
    Remote registration. Certificates are requested and received remotely, for example, from a Web portal or a corporate application that accesses KeyOne XRA (where all the information is centralized) via SOAP/XML. This model supports pre-authorized requests and requests that are subsequently approved by a registration operator.
    Automatic registration. The functions of the registration system (accessible as a Web service) are invoked remotely for approving the registration, renewal and revocation of digital certificates. As with the other modes, the connection with KeyOne XRA is carried out securely (via HTTPS and SOAP/XML). The information on the users and applications is obtained from a database, directory or corporate application.
    KeyOne products comply with service-oriented architecture (SOA) standards to reduce integration and maintenance costs. The functionality of the KeyOne applications can thus be activated as remote services and made available via a simple SOAP/XML interface.
    Integration with the registration system can be implemented in the following ways:
    - Via Safelayer's KeyOne XRA registration authority.
    - Via a corporate application that acts as the Registration Authority and accesses KeyOne CA's SOAP/WS interface.
    Integration with the validation system can be implemented in the following ways:
    - By publishing revocation lists in an external repository (e.g., a directory or Web server).
    - Via the CertStatus service that responds to requests from the KeyOne VA validation authorities.
    In KeyOne, the user, key and digital certificate life-cycles follow a fully-configurable workflow. The default operation of the KeyOne applications (in particular KeyOne CA and KeyOne XRA) can be customized to adapt to the differing needs of key management and registration procedures. The workflow-configurable operations entail application management (e.g., the issue and renewal of own keys) and the provision of services to third parties (e.g., the approval of requests in KeyOne XRA, the issue and publication of certificates in KeyOne CA). These tasks can also be automated (e.g., for the automatic renewal of the TSA or VA own keys). This minimizes the amount of manual procedures, guaranteeing service continuity.
    KeyOne components feature two mechanisms for reporting on the validity of digital certificates: the generation of evidence on digital certificate status (as per the IETF OCSP standard) using the online services, and the automatic issue of certificate revocation lists.
    KeyOne VA. The KeyOne VA validation authority is an ideal system for the critical processes of electronic-signature verification. Its main advantages are proof of response delivery and greater efficiency in validating digital certificate status. The KeyOne VA validation service is based on the IETF OCSP protocol. This service:
    - Responds to requests for information on the status of the digital certificates used in signing electronic transactions. These requests can come from users or service providers.
    - Stores information on the status of certificates generated by one or more Certification Authorities.
    - Guarantees the non-repudiation of the responses. To do this, the responses include a digital signature from the Validation Authority that specifies the date and status (valid, revoked, suspended or unknown) of the certificate.
    KeyOne VA can operate with an HSM (network or internal) and requires access to a database and a reliable time source. KeyOne VA uses the system clock, which can be synchronized with reliable sources (e.g., GPS or clocks with high-stability oscillators) using non-KeyOne mechanisms (e.g., NTP).
    The KeyOne products offer a complete cryptographic key management service.
    Copy and recovery. The KeyOne KA extension for KeyOne CA supports saving the keys generated by KeyOne CA (normally encryption keys) and later recovering them in RSA format. The keys are obtained in a PKCS #12 file protected by a key divided into N parts, where each part can only be accessed by one operator.
    Revocation. KeyOne XRA and KeyOne LXRA operators can revoke keys. Revocations are reported by generating revocation lists and by querying the KeyOne VA service via OCSP.
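      One simple way to divide a key into N parts such that every part is needed is XOR splitting. The text does not specify KeyOne's actual scheme, so this is shown purely as an illustration of the "key divided in N parts" idea:

```python
import secrets

def split(key: bytes, n: int) -> list[bytes]:
    """Split key into n parts; all n are required to recover it."""
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    last = key
    for s in shares:                      # XOR the random shares into the key
        last = bytes(a ^ b for a, b in zip(last, s))
    return shares + [last]

def combine(shares: list[bytes]) -> bytes:
    """XOR all shares back together to recover the key."""
    out = shares[0]
    for s in shares[1:]:
        out = bytes(a ^ b for a, b in zip(out, s))
    return out

key = secrets.token_bytes(16)
parts = split(key, 3)                     # one part per operator
print(combine(parts) == key)              # True
print(combine(parts[:2]) == key)          # almost surely False: a part is missing
```

      Because every part is a uniformly random string on its own, no subset of operators smaller than N learns anything about the protection key.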
    Safelayer solutions are being used in the three biggest digital identification and certification projects underway in Spain (aimed at over 40 million people): the Spanish electronic national identity card (www.dnielectronico.es), the e-passport, and the Certification Authority of the Royal Spanish Mint (www.cert.fnmt.es). To support this type of architecture, Safelayer solutions meet the following load and availability requirements.
    Transactionality. KeyOne application operations are carried out transactionally. To this end, KeyOne applications installed on more than one server access the same database, which guarantees the consistency of the data used by the service. This data is also visible from any of the applications.
    Shared access to cryptographic material. KeyOne applications can share cryptographic material (cryptographic hardware and private keys). This material can also be replicated for each application.
    Monitoring and statistics. KeyOne services incorporate a monitoring and statistics system that provides service-quality parameters, for example: the number of threads available or occupied, the number of clients waiting in the request queue, the time interval for updating statistics, the number of responses in the last statistics interval, and the average and maximum times for processing a request in the last statistics interval.
    Remote operation service. KeyOne services can be remotely started, stopped or refreshed via the remote operation service. This service (available for KeyOne CA, KeyOne TSA, KeyOne VA and KeyOne XRA) is accessible via SSL with strong authentication based on digital certificates.
    KeyOne product operation adheres to the principal security and control requirements.

    Common Criteria EAL4+
    KeyOne version 3.0 products comply with the CWA 14167-1 requirements for managing digital certificates used in electronic signatures. In addition, the KeyOne 3.0 product family is ISO/IEC 15408 EAL4+ (ALC_FLR.2) certified in compliance with the CIMC Security Level 3 Protection Profile (Certificate Issuing and Management Component, NIST, 31 October 2001); see www.oc.ccn.cni.es. KeyOne CA version 4.0 is in the process of being certified to CC EAL4+ (ALC_FLR.2). KeyOne application configuration also supports forcing operation in CWA 14167-1 and CIMC NSA/NIST modes (defining roles and events) and defining a customized security level.

    Event Logging
    KeyOne products feature an extremely flexible and reliable event logging system, whose functionality includes:
    - Selecting the events to be logged.
    - Configuring the degree of severity associated with each event type.
    - Protection against log failure (emergency log records).
    - Protection of log integrity.

    Cryptographic Hardware
    The KeyOne applications support FIPS 140-2 Level 3 hardware security modules (HSMs). These modules incorporate M-of-N operator access control to guarantee the maximum protection of the private keys.

    Role Management
    KeyOne features the role management, auditing and reporting mechanisms that CWA 14167-1 recommends for managing the digital certificates used in electronic signatures.
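One common way to protect log integrity, as the event logging feature above requires, is hash chaining: each record embeds the digest of its predecessor, so altering any earlier record invalidates every later one. The sketch below is a generic illustration with hypothetical names, not KeyOne's actual logging format.

```python
import hashlib
import json

def append_record(log: list[dict], event: str, severity: str) -> None:
    """Append an event record chained to its predecessor by a SHA-256 digest."""
    prev = log[-1]["digest"] if log else "0" * 64
    body = {"event": event, "severity": severity, "prev": prev}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    log.append({**body, "digest": digest})

def verify_log(log: list[dict]) -> bool:
    """Recompute the chain; any tampered record breaks verification."""
    prev = "0" * 64
    for rec in log:
        if rec["prev"] != prev:
            return False
        body = {"event": rec["event"], "severity": rec["severity"], "prev": prev}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if rec["digest"] != expected:
            return False
        prev = rec["digest"]
    return True

log: list[dict] = []
append_record(log, "certificate_issued", "info")
append_record(log, "key_revoked", "warning")
assert verify_log(log)
```

A deployed system would also sign the chain head periodically (or write it to write-once media) so an attacker cannot simply rebuild the whole chain.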
    Safelayer's KeyOne family provides a complete solution for deploying the e-passport PKI, covering both the ICAO (e-passport signing) and the EAC (e-passport verification) parts. This solution passed the latest tests undertaken in Prague (www.e-passports2008.org).

    KeyOne Products for the ICAO PKI
    The KeyOne products for the ICAO part include all the entities for issuing first-generation e-passports:
    - The CSCA (Country Signing Certification Authority) issues certificates for the national DSs (Document Signers).
    - The DSs sign the e-passports.
    - The NPKD (National Public Key Directory) distributes the certificates and CRLs used by the ISs (Inspection Systems) to validate e-passports.
    These products have already been deployed for issuing first-generation Spanish e-passports.

    KeyOne Products for the EAC PKI
    The KeyOne products for the EAC part include all the entities (except the IS, or Inspection System) for verifying second-generation e-passports:
    - The CVCA (Country Verifying Certification Authority) issues CV digital certificates for the DVs (Document Verifiers).
    - The DVs issue CV digital certificates for the national ISs.
    - The CVRA (Country Verifying Registration Authority) acts as a SPOC (Single Point of Contact) and a national Registration Authority.
    These products have been deployed in a first phase for verifying second-generation Spanish e-passports (pending the final standardization of the communication processes). See e-Passport SPOC in KeyOne for a more detailed description of the KeyOne components for the EAC PKI.
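The two trust hierarchies described above can be sketched as issuer chains that an inspection system walks up to its trust anchor. This is a toy model of the structure only; real deployments use X.509 certificates (ICAO) and CV certificates (EAC), and the dictionaries below are illustrative assumptions.

```python
# Issuer relationships in the two e-passport PKIs (toy model):
# ICAO part: CSCA certifies the Document Signers that sign passports.
# EAC part: CVCA certifies DVs, which in turn certify Inspection Systems.
ICAO_CHAIN = {"DS": "CSCA"}
EAC_CHAIN = {"IS": "DV", "DV": "CVCA"}

def trust_path(entity: str, chain: dict[str, str]) -> list[str]:
    """Walk issuer links from an entity up to its trust anchor (the root)."""
    path = [entity]
    while path[-1] in chain:
        path.append(chain[path[-1]])
    return path

print(trust_path("DS", ICAO_CHAIN))   # DS -> CSCA
print(trust_path("IS", EAC_CHAIN))    # IS -> DV -> CVCA
```

The EAC chain is one level deeper than the ICAO chain because foreign inspection systems must be individually authorized (via a DV) before they may read sensitive biometric data.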
    KeyOne supports a long list of technical standards to guarantee interoperability and ROI:
    - HSM/Token support: PKCS#11 devices approved by Safelayer
    - Digital certificate format: ITU-T X.509v3
    - Proprietary extensions: IETF, Microsoft and Netscape (with the possibility of incorporating any extension)
    - Digital certificate request format: self-signed ITU-T X.509v3 or PKCS#10
    - Digital certificate delivery format: ITU-T X.509v3, PKCS#7 or PKCS#12
    - Revocation list format: ITU-T X.509v2 (CRLv2)
    - Data protection: IETF CMS/PKCS#7, IETF S/MIME, IETF electronic signature for PDF in a PKCS#7 container
    - Online digital certificate validation protocol: RFC 2560 (OCSP)
    - Time-stamp protocol: RFC 3161
    - Directory: Microsoft Active Directory and LDAP (with a schema adaptable to KeyOne RA and KeyOne CA)
    - Encryption algorithms: DES, Triple DES, AES-128, AES-256, RC2, RC4
    - Electronic signature algorithms: RSA up to 4096 bits, DSA, ECDSA
    - Hash algorithms: SHA-1, SHA-256, SHA-384, SHA-512, MD2, MD5, RIPEMD-160
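Most of the hash algorithms in the list above are directly available in standard libraries; for instance, Python's `hashlib` covers all of them except MD2 and (on some builds) RIPEMD-160. A quick way to compare digests of the same message across algorithms:

```python
import hashlib

# Digest the same message under several of the listed hash algorithms.
# MD2 is omitted (not provided by hashlib); RIPEMD-160 availability
# depends on the underlying OpenSSL build.
message = b"interoperability test"
for name in ("sha1", "sha256", "sha384", "sha512", "md5"):
    digest = hashlib.new(name, message).hexdigest()
    print(f"{name:7s} ({len(digest) * 4} bits): {digest}")
```

Note that MD2, MD5 and SHA-1 are listed for interoperability with legacy material only; they are considered broken for signature use.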
