Adding your catalog into Kibo’s order management system (OMS) is an easy way to allow your customer support representatives to create offline orders, add new items to shipments, and replace returned items.
The advice here is for Shopify specifically, but it can be adapted to any ecommerce system or PIM. We will leverage the Kibo Import/Export APIs for efficient imports of product data.
First, we’ll outline how to implement a bulk load of all products, which can then be modified into delta updates after the first full upload. The examples will be in TypeScript and using the Kibo TypeScript REST SDK, but you can use any SDK provided by Kibo or craft the API request yourself. Because the Kibo APIs are standard REST APIs accessed using Bearer OAuth authentication, you can easily write connector code using the programming language of your choice.
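Before calling any Kibo API you need a bearer token. The sketch below shows an OAuth client-credentials exchange; note that the auth URL and environment variable names here are assumptions, so confirm the exact endpoint for your tenant in the Kibo documentation.

```typescript
// Hypothetical token endpoint and env variable names -- confirm these
// against your Kibo tenant configuration before using.
const AUTH_URL = 'https://home.mozu.com/api/platform/applications/authtickets/oauth';

// Builds the Authorization header sent on every subsequent API call
export const bearerHeader = (token: string) => ({
  Authorization: `Bearer ${token}`,
});

export const getKiboAccessToken = async (): Promise<string> => {
  const response = await fetch(AUTH_URL, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      client_id: process.env.KIBO_CLIENT_ID,
      client_secret: process.env.KIBO_CLIENT_SECRET,
      grant_type: 'client_credentials',
    }),
  });
  const data = (await response.json()) as { access_token: string };
  return data.access_token;
};
```

Each request to Kibo then carries the token via the `Authorization: Bearer <token>` header, which is what `bearerHeader` builds.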
Deciding on the catalog structure
Before you write any code relating to your catalog import, you want to make sure that the structure of the catalog is designed correctly and matches your use cases. This involves some choices:
- If you’re going to use configurable variants
- If you want to use multiple product types or store everything in a single product type
- Your child catalogs and how they are represented
- If you are using pricelists or resolving pricing using the price field on the product
This page can help you understand the different trade-offs when designing the schema of your catalog.
In this particular scenario, we won’t use configurable variants. We’ll store everything in a single product type and use a single child catalog. This maximizes simplicity, especially since the catalog’s source of truth is an external system and admin users will not be changing product data in Kibo.
Step 1: Pull the Shopify catalog data
First, we’ll download the entire Shopify catalog locally so that we can process it into CSV files in the next step, which will later be ingested by Kibo.
Every ecommerce system has a different way of retrieving product data. In Shopify, you use the Get Products API. Shopify implements paging by providing a “next” header parameter, which returns a link to the next page of products. Writing a recursive implementation yields the following code:
import axios from 'axios';
import { writeFileSync } from 'fs';
import { join } from 'path';

export const downloadFilesToTemp = async (tempdir: string) => {
  async function fetchAndSaveProducts(url: string | undefined, page: number = 1): Promise<void> {
    // On the first call there is no URL yet, so start at the first page
    url = url ?? `https://${process.env.SHOPIFY_STORE_URL_US}/admin/api/${API_VERSION}/products.json`;
    const response = await axios.get(url, {
      headers: {
        'Content-Type': 'application/json',
        'X-Shopify-Access-Token': process.env.SHOPIFY_API_KEY
      },
    });
    const products = response.data.products as Product[];
    // Save this page of products to a JSON file
    const filePath = join('/tmp/', `${tempdir}/${String(page).padStart(2, '0')}.json`);
    writeFileSync(filePath, JSON.stringify(products, null, 2));
    // Extract the next page URL from the Link header
    const linkHeader = response.headers['link'];
    if (linkHeader) {
      const links = linkHeader.split(',');
      const nextLink = links.find((link: string) => link.includes('rel="next"'));
      if (nextLink) {
        const nextPageUrl = nextLink.match(/<(.*?)>/)?.[1];
        if (nextPageUrl) {
          await fetchAndSaveProducts(nextPageUrl, page + 1);
        }
      }
    }
  }
  await fetchAndSaveProducts(undefined);
}
This fetches all products into a temporary directory to be processed by the next step. We process and save files in the temporary directory instead of keeping all of the representations in memory so that we don’t run into memory issues with extremely large catalogs.
Step 2: Transform to CSV
The entire catalog can be represented by three files in the Kibo bulk catalog import structure: products.csv, productcatalog.csv, and productimages.csv.
products.csv:
This file contains product data at the master catalog level: the general catalog fields, including basics like product name and description, as well as any custom properties.
productcatalog.csv:
This file contains information about the product at the child catalog level. This includes whether the product is active in that catalog, whether any of the content, price, or SEO information is overridden, and, if so, the values of those properties.
productimages.csv:
Because one product can have multiple images, the images associated with the product are split out into a separate file. This file associates each product’s image, its sequence, and the URL associated with that image.
files.forEach(file => {
  const fileContent = readFileSync(join(filePath, file), 'utf-8');
  const productListing = JSON.parse(fileContent) as Product[];
  for (const product of productListing) {
    for (const variant of (product.variants as Variant[]) || []) {
      // Skip variants without a SKU -- the SKU becomes the Kibo ProductCode
      if (!variant.sku) {
        continue;
      }
      productCatalogLines.push(createProductCatalogMapping(product, variant));
      for (const image of (product.images as Image[]) || []) {
        if (image.product_id == variant.product_id) {
          productImages.push(createProductImageMapping(image, product));
        }
      }
      products.push(createProductMapping(product, variant));
    }
  }
});
The associated mapping functions look like this:
const createProductMapping = (
  product: Product,
  variant: Variant
): ProductKibo => {
  return {
    MasterCatalogName: 'My Master Catalog',
    ProductCode: String(variant.sku),
    ProductType: 'Default',
    ProductUsage: 'Standard',
    ManufacturerPartNumber: '',
    UPC: '',
    DistributorPartNumber: '',
    IsTaxable: 'True',
    ManageStock: 'False',
    'Ships by itself': 'False',
    OutOfStockBehavior: 'DisplayMessage',
    PackageWeight: (variant.weight || 0) > 0 ? String(variant.weight) : '0.01',
    PackageWeightUnitId: variant.weight_unit || 'kg',
    PackageLength: '1',
    PackageLengthUnitId: 'in',
    PackageWidth: '1',
    PackageWidthUnitId: 'in',
    PackageHeight: '1',
    PackageHeightUnitId: 'in',
    FulfillmentTypes: 'DirectShip&InStorePickup',
    RestrictDiscount: 'False',
    RestrictDiscountStartDate: '',
    RestrictDiscountEndDate: '',
    VariationPricingMethod: '',
    Price: variant.price || '0',
    ProductName: `${product.title} ${variant.title}`,
    ProductShortDescription: '',
    ContentFullProductDescription: product.body_html || '',
    availability: '',
    barcode: variant.barcode || '',
    rating: ''
  };
};
You can add any custom attributes to the type definition and then map them at the bottom of this function.
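With the row arrays populated, they still need to be serialized to disk as the three CSV files. That step isn’t shown above, so here is a minimal sketch, assuming flat row objects and the /tmp output layout used in Step 3. The `writeCsvFiles` name and the hand-rolled serializer are assumptions; a library such as csv-stringify would work just as well.

```typescript
import { writeFileSync } from 'fs';
import { join } from 'path';

// Minimal CSV serializer for flat row objects. Values containing commas,
// quotes, or newlines are escaped per RFC 4180.
const toCsv = (rows: Record<string, unknown>[]): string => {
  if (rows.length === 0) return '';
  const headers = Object.keys(rows[0]);
  const escape = (value: unknown): string => {
    const s = String(value ?? '');
    return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
  };
  const lines = rows.map((row) => headers.map((h) => escape(row[h])).join(','));
  return [headers.join(','), ...lines].join('\n');
};

// Writes the three files into the same output directory the zip step reads
export const writeCsvFiles = (
  tempdir: string,
  products: Record<string, unknown>[],
  productCatalogLines: Record<string, unknown>[],
  productImages: Record<string, unknown>[]
) => {
  const outDir = join('/tmp', tempdir, 'output');
  writeFileSync(join(outDir, 'products.csv'), toCsv(products));
  writeFileSync(join(outDir, 'productcatalog.csv'), toCsv(productCatalogLines));
  writeFileSync(join(outDir, 'productimages.csv'), toCsv(productImages));
};
```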
Step 3: Generate the Zip file for uploading to Kibo
This part is straightforward. We simply use AdmZip to generate the zip file for uploading to Kibo.
import AdmZip from 'adm-zip'
import fs from 'fs'
import { join } from 'path'
export async function zipOutputFiles(tempdir: string) {
  // Bundle the three CSVs into a single archive for upload
  const zip = new AdmZip();
  const outDir = join("/tmp", tempdir, "output");
  zip.addFile("products.csv", fs.readFileSync(join(outDir, "products.csv")));
  zip.addFile("productimages.csv", fs.readFileSync(join(outDir, "productimages.csv")));
  zip.addFile("productcatalog.csv", fs.readFileSync(join(outDir, "productcatalog.csv")));
  zip.writeZip(join(outDir, "result.zip"));
}
Step 4: Upload Zip file to Kibo
See the Import/Export API overview in the Kibo documentation for more details on how these APIs work.
export const importJob = async ({
  params,
  zipFilePath,
}: {
  params: WorkflowParams;
  zipFilePath: string;
}) => {
  const { kiboApiService, logger, apiContext } = params;
  const job = {
    name: "Shopify Catalog Import",
    format: "Legacy",
    domain: "catalog",
    contextOverride: {
      locale: apiContext.localeCode,
      currency: apiContext.currencyCode,
      masterCatalog: apiContext.masterCatalogId,
      catalog: apiContext.catalogId,
      site: apiContext.siteId,
    },
    resources: ["ProductCatalog", "ProductImages", "Products"].map((resource) => {
      return { format: "Legacy", resource, deleteOmitted: true };
    }),
  };
  const scheduledJob = await kiboApiService.createImport({ zipFilePath, job });
  logger.info(`Import job created successfully siteId: ${apiContext.siteId} import: ${scheduledJob.id}`, { scheduledJob });
};
And that’s all it takes! We have successfully pulled all the products from Shopify, transformed them, and uploaded them into Kibo, which now has a fully functioning catalog.
The next modification you can make is running the full export hourly to keep the data in sync. If you would like to sync on a shorter timeframe, you can convert to a delta approach: filter the Shopify products to those updated within a certain window using the updated_at_min query parameter, then work through the same sequence of steps, creating the CSV files and uploading them into Kibo. All it takes is a small modification to the code in Step 1:
const currentDate = new Date();
const fifteenMinutesAgo = new Date(currentDate.getTime() - 15 * 60 * 1000);
const utcFifteenMinutesAgo = fifteenMinutesAgo.toISOString();
url = `https://${process.env.SHOPIFY_STORE_URL_US}/admin/api/${API_VERSION}/products.json?updated_at_min=${utcFifteenMinutesAgo}`;
For deployment, you can easily put this code into a Lambda function on AWS, a Cloud Function on GCP, or an Azure Function.
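As a sketch of what that deployment glue might look like, here is a Lambda-style handler that runs the four steps in order. The steps are injected as functions so each stage can be swapped for the real implementations above (`downloadFilesToTemp`, the Step 2 transform, `zipOutputFiles`, and the import call); `makeHandler` and the step signature are assumptions, not part of any Kibo or AWS API.

```typescript
import { mkdirSync } from 'fs';
import { join } from 'path';

// Each pipeline stage receives the scratch directory name and does its work
type Step = (tempdir: string) => Promise<void>;

// Hypothetical factory producing an AWS Lambda-compatible handler
export const makeHandler = (steps: Step[]) => async () => {
  // Use a unique scratch directory per invocation; Lambda only permits
  // writes under /tmp
  const tempdir = `catalog-sync-${Date.now()}`;
  mkdirSync(join('/tmp', tempdir, 'output'), { recursive: true });
  for (const step of steps) {
    await step(tempdir);
  }
  return { statusCode: 200, body: `Catalog import scheduled from ${tempdir}` };
};
```

In practice you would export `makeHandler([downloadFilesToTemp, transform, zipOutputFiles, upload])` as the Lambda entry point, where `transform` and `upload` wrap the Step 2 and Step 4 code.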
In general, the bulk import-export APIs are a powerful approach to importing large catalogs into Kibo at scale and can be used even for smaller updates by reusing the same code. You have flexibility, both in the structure of your catalog, as well as how you map the data. The same approach can be used for writing PIM integrations and any other integration involving bulk data import into Kibo where the resource is supported in the Kibo bulk APIs.