User migration CSV import

Introduction

Verify simplifies and streamlines the complex task of migrating a large volume of users through its APIs. The standard way to onboard Verify users is the provided REST API. This guide walks you through onboarding a high volume of users in a short span of time.

Verify supports the following ways to import user data:

  1. Manual entry
  2. CSV import API
  3. User API (single)
  4. Bulk load API (multiple)

In this guide, the most common method, CSV import, is used.

Gathering data

Format of the data

To import users through the CSV method, Verify requires that your user repository (whether an LDAP directory or a SQL database) be exported to a UTF-8 encoded CSV file. Ensure that the method you use to convert your data supports this format.

Type of user

It is important to note the type of users being imported. See the article on the structure of Verify's directory to learn more about the Verify Cloud Directory model. In this guide, users are imported as standard Cloud Directory user accounts with a local password that is generated upon user creation; this corresponds to the regular user type. If you want to import only the user's metadata, without a local password, replace two attribute values: realm and userCategory.

Access token

The API client used to import users must have at least one of the following permissions in IBM Verify:

  • Manage users and groups, or
  • Synchronize users and groups, or
  • Manage users and standard groups

The application must have acquired an Access Token using the Client Credentials flow.
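As a sketch, the token can be obtained with a client-credentials request like the following. The /v1.0/endpoint/default/token path and the client_id and client_secret placeholders are assumptions; confirm the token endpoint documented for your tenant.

```
curl -X POST "https://${tenant_url}/v1.0/endpoint/default/token" \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -d "grant_type=client_credentials" \
  -d "client_id=${client_id}" \
  -d "client_secret=${client_secret}"
```

The JSON response contains an access_token value, which is used as the Bearer token in the calls that follow.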

Get CSV headers

📘

Using custom attribute names?

Most organizations use multiple attributes to hold user data. Verify might not have all of these attributes out of the box, so create any custom attributes you need before calling the Get CSV headers API. See the Working with custom attributes section to learn how to create custom attributes.

To properly structure your CSV file for import, first retrieve the available header names. If you are already familiar with the user schema: for standard attributes, the header name equals the attribute's SCIM identifier; for a custom attribute, it is the name specified when the attribute was created.

curl -X GET "https://${tenant_url}/v2.0/CSV/headerNames?filter=user" -H "Authorization: Bearer ${access_token}"

{
    "headerNames": [
        {
            "dataType": "string",
            "name": "preferred_username", // This is a column header name
            "required": true // This indicates whether you must include it
        },
        {
            "dataType": "string",
            "name": "family_name",
            "required": false
        },
        {
            "dataType": "string",
            "name": "given_name",
            "required": false
        },
        {...},
        {
            "dataType": "string",
            "name": "password",
            "required": false
        },
        {
            "dataType": "string",
            "name": "customAttr1",
            "required": false
        }
    ]
}

This response includes all possible header names. Use the name attribute from each entry as a column name in the CSV file. In the preceding call, the filter=user query parameter excludes the attribute names that belong to groups.
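To find which columns are mandatory, you can filter the saved response for entries with "required": true. The sketch below uses grep and cut on a trimmed, single-object-per-line sample of the response; if jq is available, `jq -r '.headerNames[] | select(.required) | .name' headers.json` is more robust.

```shell
# Illustrative subset of the headerNames response, saved as headers.json
cat > headers.json <<'EOF'
{
    "headerNames": [
        { "dataType": "string", "name": "preferred_username", "required": true },
        { "dataType": "string", "name": "family_name", "required": false },
        { "dataType": "string", "name": "given_name", "required": false }
    ]
}
EOF

# Print the name of every required column (assumes one JSON object per line)
grep '"required": true' headers.json | grep -o '"name": "[^"]*"' | cut -d'"' -f4
# → preferred_username
```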

Create the CSV file

The following displays a simple CSV file built with basic user fields and an autogenerated password:

preferred_username,given_name,family_name,email,customAttr1
user1,John,Doe,[email protected],custdata1
user2,Jane,Smith,[email protected],custdata2
user3,Barbara,Jordan,[email protected],custdata3

Note: The maximum file size is 10MB. If your file is larger, break it up into multiple CSV files.

Append more columns as necessary. Attribute values can always be updated after the fact in the admin console. Save this file as a UTF-8 encoded CSV text file.
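As a sketch, a file like the one above can be generated with printf (which writes plain UTF-8 with no BOM), and a file over the 10 MB limit can be split into smaller chunks while repeating the header row in each part. The user rows and the tiny chunk size are invented for illustration:

```shell
# Write a small sample import file; printf emits plain UTF-8 without a BOM.
printf '%s\n' \
  'preferred_username,given_name,family_name,email' \
  'user1,John,Doe,user1@example.com' \
  'user2,Jane,Smith,user2@example.com' \
  'user3,Barbara,Jordan,user3@example.com' \
  > user_import.csv

# If the file exceeds 10 MB, split the data rows into chunks (2 rows per
# chunk here, purely for illustration) and prepend the header to each part.
header=$(head -n 1 user_import.csv)
tail -n +2 user_import.csv | split -l 2 - chunk_
for part in chunk_*; do
  { printf '%s\n' "$header"; cat "$part"; } > "${part}.csv"
  rm "$part"
done
```

Each resulting chunk_*.csv file carries the header row followed by its share of the user rows, so every chunk can be imported independently.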

❗

Note: Excel Users

If you exported your CSV from Excel, the file often begins with a few odd characters. This is a simple fix: open the file in a raw text editor and delete them. The characters (bytes EF BB BF, often displayed as ï»¿) are known as a Byte Order Mark (BOM). If you don't see the characters in the text editor, copy and paste the text into a new file and save that instead.
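Alternatively, the BOM can be stripped from the command line. This sketch first fabricates a sample file that starts with a BOM (as Excel writes it), then removes the leading three bytes if they match:

```shell
# Create a sample file that starts with the UTF-8 BOM bytes (EF BB BF; octal 357 273 277)
printf '\357\273\277preferred_username,given_name,family_name,email\n' > user_import.csv

# If the first three bytes are the BOM, rewrite the file without them
if [ "$(head -c 3 user_import.csv)" = "$(printf '\357\273\277')" ]; then
  tail -c +4 user_import.csv > user_import.tmp && mv user_import.tmp user_import.csv
fi

head -c 18 user_import.csv   # → preferred_username
```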

Password migration

Verify supports onboarding existing password hashes as well as passwords given in plain text. If you want Verify to create users with a known or hashed password, add a column called password to each user row. If your database or LDAP provides a one-way hash in one of the supported formats, prefix the value with {TYPE}. For example, if you are using Salted-SHA256, prefix the value with {SSHA256}.

Note that if your password value contains a '+' sign, escape each '+' by replacing it with '%2B'.
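For example, a hypothetical Salted-SHA256 hash containing '+' characters can be escaped with sed before it is placed in the CSV:

```shell
# Hypothetical hash value; every '+' must become %2B before import
hash='{SSHA256}AbC+dEf+123='
printf '%s\n' "$hash" | sed 's/+/%2B/g'   # → {SSHA256}AbC%2BdEf%2B123=
```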

Example CSV file

Here's an example CSV file with minimum information:

preferred_username,given_name,family_name,email
user1,John,Doe,[email protected]
user2,Jane,Smith,[email protected]
user3,Barbara,Jordan,[email protected]

Note that spaces after commas break the import.

Import the file

Once your file is created, a few more configurations are supported. In the following call, all email notifications for new accounts are suppressed through the notifyType=NONE query parameter. Additionally, users keep their randomly generated password without being forced to change it on first login (they can reset it in their own time): to suppress the change-password requirement, add the header usershouldnotneedtoresetpassword: true.

curl --location -X POST "https://${tenant_url}/v2.0/CSV/importUsers?notifyType=NONE" \
--header "Authorization: Bearer ${access_token}" \
--header 'usershouldnotneedtoresetpassword: true' \
--form 'file=@/path/to/file/user_import.csv'

If the file is formatted correctly and accepted, the response contains an ID, which you can use to poll the status API and check that everything is working. Users are processed sequentially; a file at the maximum size (10 MB) may take around 15-20 minutes to process.

Check import status

While users are being processed, you can check the current status of the import by calling the CSV jobs API.

curl --location --request GET "https://${tenant_url}/v2.0/CSV/jobs/${id}" \
--header "Authorization: Bearer ${access_token}"

The response provides the following status:

  • Current state (state)
  • Number of users that are unprocessed (unprocessedCount)
  • Number of users that are processed and added (processedCount)
  • If there is any error, the count is provided (errorsCount)
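The status check can be wrapped in a simple polling loop. This is a sketch: the 30-second interval is arbitrary, the jq field names follow the list above, and the literal values of state are not documented here, so adjust any exit condition to the values your tenant returns.

```
# Poll the CSV job every 30 seconds and print the progress counters
while true; do
  curl -s "https://${tenant_url}/v2.0/CSV/jobs/${id}" \
    -H "Authorization: Bearer ${access_token}" |
    jq '{state, processedCount, unprocessedCount, errorsCount}'
  sleep 30
done
```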

📘

Migrating a large volume of users? No problem.

Verify handles the challenging task of migrating large amounts of user data through APIs. Customers may open a ticket with IBM Support to request bulk loading of user data; a secure upload location is provided through this process.