User migration CSV import

Introduction

Migrating user repositories, whether big or small, can be a daunting task, but Verify provides a simple way to do it. The standard process for onboarding Verify users is to use the provided REST API. This guide walks you through onboarding a large number of users in a short timeframe.

Verify supports the following ways to import user data:

  1. Manual entry
  2. CSV import API
  3. User API (single)
  4. Bulk load API (multiple)

In this guide, we will use the CSV import method as it is the most common.

Gathering data

Format of the data

To import users via the CSV method, Verify requires that your user repository (whether it's LDAP or a SQL database) be exported to a UTF-8 encoded CSV file. Ensure that the tool you plan to use to export your data supports this format.
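How you export depends on your source system. As a minimal sketch, assuming a PostgreSQL database with a hypothetical users table (the connection string, table, and column names here are placeholders, not part of this guide), the psql \copy command can produce a UTF-8 CSV with a header row directly:

# Export users to a UTF-8 CSV with a header row; the column aliases match
# the Verify CSV header names used later in this guide.
psql "$DATABASE_URL" -c "\copy (SELECT username AS preferred_username, first_name AS given_name, last_name AS family_name, email FROM users) TO 'user_export.csv' WITH (FORMAT csv, HEADER true, ENCODING 'UTF8')"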

Type of user

It is important to note the type of users you will be importing. If you're not familiar with Verify's Cloud Directory model, read the article on the structure of Verify's directory. In this guide, we will import users as standard Cloud Directory user accounts with a local password that is generated at user creation. This is indicated by the regular user type. If you wish to import only the user's metadata, without a local password, you will need to replace two attribute values: realm and userCategory (see the sketch below).
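As a rough illustration only (the realm and userCategory values below are placeholders, not values confirmed by this guide; consult the Cloud Directory article for the values that match your identity source), the two attributes would simply appear as additional columns on each row:

preferred_username,given_name,family_name,email,realm,userCategory
user1,John,Doe,[email protected],<identity_source_realm>,<non_regular_user_category>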

Access token

The API client used to import users must have at least one of the following permissions in IBM Security Verify:

  • Manage users and groups, or
  • Synchronize users and groups, or
  • Manage users and standard groups

The application must acquire an access token using the Client Credentials flow.
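As a minimal sketch, assuming the default OIDC token endpoint path for your tenant (verify the path and the client credentials of your API client in the admin console), the token request looks like this:

# Request an access token using the Client Credentials grant.
# ${client_id} and ${client_secret} come from the API client you configured.
curl --location -X POST "https://${tenant_url}/v1.0/endpoint/default/token" \
--header "Content-Type: application/x-www-form-urlencoded" \
--data-urlencode "grant_type=client_credentials" \
--data-urlencode "client_id=${client_id}" \
--data-urlencode "client_secret=${client_secret}"

The access_token value in the JSON response is what the later calls pass in the Authorization: Bearer header.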

Get CSV headers

To properly structure your CSV file for import, you first need to know which header names are available. If you're already familiar with the user schema: the header name is the attribute's SCIM identifier, or, for a custom attribute, the name specified when the attribute was created.

curl -X GET "https://${tenant_url}/v2.0/CSV/headerNames?filter=user" -H "Authorization: Bearer ${access_token}"

{
    "headerNames": [
        {
            "dataType": "string",
            "name": "preferred_username", // This is a column header name
            "required": true // This indicates whether you must include it
        },
        {
            "dataType": "string",
            "name": "family_name",
            "required": false
        },
        {
            "dataType": "string",
            "name": "given_name",
            "required": false
        },
        {...},
        {
            "dataType": "string",
            "name": "password",
            "required": false
        }
    ]
}

This response includes all possible header names. The value of the name field is what you use as the column header in the CSV. In the call above, we added a user filter; without it, we would also get attribute names for groups, which is not what we want.

Create the CSV file

If we wanted to build a quick, minimal CSV file with the basic user fields and an autogenerated password, this is what it would look like:

preferred_username | given_name | family_name | email
user1              | John       | Doe         | [email protected]
user2              | Jane       | Smith       | [email protected]
user3              | Barbara    | Jordan      | [email protected]

Note: The maximum file size is 10 MB. If your file is larger, break it up into multiple CSV files (a command-line sketch for this follows below).
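One rough way to do the split from the command line (assuming a POSIX shell with the standard head, tail, and split utilities; file names and the chunk size are placeholders) is to chunk the data rows and re-add the header row to each piece:

# Split user_import.csv into chunks of 50,000 data rows each,
# then prepend the original header row to every chunk.
head -n 1 user_import.csv > header.csv
tail -n +2 user_import.csv | split -l 50000 - chunk_
for f in chunk_*; do
  cat header.csv "$f" > "import_$f.csv"
  rm "$f"
done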

Append more columns as necessary. Attribute values can always be updated after the fact in the admin console. Save this file as a UTF-8 encoded CSV text file.

❗️

Excel users, be aware!

If you exported your CSV from Excel, the encoding often includes some odd characters at the beginning of the file. This is a simple fix: open the file in a raw text editor and delete the characters. They look like ï»¿ and are known as a Byte Order Mark (BOM). If you don't see the characters in the text editor, copy and paste the text into a new text editor window and save it as a new file.
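If you'd rather handle this from the command line, here is a small sketch (assuming GNU sed for the \x hex escapes; the file name is a placeholder):

# Show the first three bytes; a UTF-8 BOM appears as ef bb bf.
head -c 3 user_import.csv | od -An -tx1

# Strip the BOM from the start of the file, if present.
sed -i '1s/^\xEF\xBB\xBF//' user_import.csv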

Password migration

Verify supports onboarding both existing password hashes and passwords given in plain text. If you want to create users with a known (or hashed) password, add a column called password to each user row, and Verify will create the user with that password. If your database or LDAP provides the one-way hash in one of the supported formats, you will need to take that value and prefix it with {TYPE}. For example, if you are using Salted SHA-256, the value must be prefixed with {SSHA256}.

Note that if your password value contains a '+' sign, escape it by replacing each '+' with '%2B'.
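As an illustration (the hash below is a made-up placeholder, not a real SSHA256 digest), a CSV carrying one plain-text and one pre-hashed password might look like this:

preferred_username,given_name,family_name,email,password
user1,John,Doe,[email protected],Passw0rdExample!
user2,Jane,Smith,[email protected],{SSHA256}placeholderhash%2Bsalt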

Example CSV file

Here's an example CSV file with minimum information:

preferred_username,given_name,family_name,email
user1,John,Doe,[email protected]
user2,Jane,Smith,[email protected]
user3,Barbara,Jordan,[email protected]

Note that spaces after commas will break the import.

Import the file

Once your file has been created, there are a few more options supported on the import call. In the call below, we suppress all email notifications for the new accounts with the parameter notifyType=NONE. Additionally, a random password is generated at user creation, but we don't want to force the user to change it on first login (we'll let them reset it on their own time). To suppress the change-password requirement on first login, we add the header usershouldnotneedtoresetpassword: true.

curl --location -X POST "https://${tenant_url}/v2.0/CSV/importUsers?notifyType=NONE" \
--header "Authorization: Bearer ${access_token}" \
--header 'usershouldnotneedtoresetpassword: true' \
--form 'file=@/path/to/file/user_import.csv'

If the file was formatted correctly and accepted, you'll receive an ID back. This can be used to poll the status API to confirm that everything is working. Users are processed sequentially; a file at the maximum size (10 MB) may take around 15-20 minutes to process.

Check import status

While users are being processed, you can easily see the current status of the import by calling the CSV jobs API.

curl --location --request GET "https://${tenant_url}/v2.0/CSV/jobs/${id}" \
--header "Authorization: Bearer ${access_token}"

The response provides the current state and statistics on the number of users that are still unprocessed (unprocessedCount) and that have been processed and added (processedCount). Should there be any errors, a count is provided (errorsCount).
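A minimal polling sketch, assuming the jq CLI is available and that the counters above appear at the top level of the response (adjust the field paths if your tenant returns a different shape):

# Poll the job every 60 seconds until no users remain unprocessed.
while true; do
  response=$(curl -s --location --request GET "https://${tenant_url}/v2.0/CSV/jobs/${id}" \
    --header "Authorization: Bearer ${access_token}")
  echo "$response" | jq '{processedCount, unprocessedCount, errorsCount}'
  [ "$(echo "$response" | jq -r '.unprocessedCount')" = "0" ] && break
  sleep 60
done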

📘

Migrating millions of users? No problem.

While migrating millions of users through the APIs described above is possible, it is far more efficient to work with our team to schedule the bulk load on your behalf. Customers may open a support ticket with IBM Support by using the web-based IBM Support Community or by telephone (1-800-IBM-SERV / 800-426-7378). The service management team will follow up with a secure upload location. You then upload your user data to this location, and the service management team will load it into IBM Security Verify.