Learn how data streaming pipelines and natural language generation (NLG) can increase your page traffic by automatically publishing product descriptions and meta tags in real time.
Although we are an early-stage startup, we have already worked on content automation projects with well-known brands that generate several million dollars in e-commerce revenue per year. Through smart content automation, they have seen traffic increases ranging from 25% up to a remarkable 95%, in some cases nearly doubling their revenue.
When it comes to automating your content, streaming your product data through several systems plays an increasingly important role. You don't want to wait hours, days, or even weeks until your store's contents are updated, do you? Have you ever imported data manually via Excel and want to avoid that kind of labor in the future? Then this article is for you.
Of course our simple example can be scaled to thousands of products.
First of all, let's have a look at the product data attributes we used. The following listing shows an example extract of product data stored in Shopify:
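The original listing is not reproduced here, so below is an illustrative extract of what such a Shopify product document can look like. The field names follow the Shopify Product API; the values are invented for this example:

```json
{
  "product": {
    "id": 632910392,
    "title": "Mobile Laptop Stand",
    "body_html": "",
    "vendor": "ExampleVendor",
    "product_type": "Laptop Stand",
    "tags": "laptop, stand, portable, aluminium",
    "variants": [
      {
        "id": 808950810,
        "title": "Silver",
        "price": "49.90",
        "sku": "LS-001"
      }
    ]
  }
}
```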
As you can see from the JSON document, our product is a mobile laptop stand. The product data - if already uploaded - can be accessed via GET request:
https://<YOUR-SHOP-NAME>.myshopify.com/admin/api/2021-04/products.json
For further details please refer to the Shopify Product API.
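As a minimal sketch of that request, assuming authentication via an access token in the `X-Shopify-Access-Token` header (the shop name and token below are placeholders), fetching the product list could look like this:

```python
import json
import urllib.request

API_VERSION = "2021-04"

def products_url(shop: str, api_version: str = API_VERSION) -> str:
    """Build the Admin API endpoint for listing products."""
    return f"https://{shop}.myshopify.com/admin/api/{api_version}/products.json"

def fetch_products(shop: str, access_token: str) -> list:
    """Fetch the store's products (first page only, for brevity)."""
    request = urllib.request.Request(
        products_url(shop),
        headers={"X-Shopify-Access-Token": access_token},
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        return json.load(response)["products"]

# Usage (replace the placeholders with your shop name and access token):
# for product in fetch_products("your-shop-name", "<ACCESS-TOKEN>"):
#     print(product["id"], product["title"])
```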
In this case we already have our data uploaded and listed in our Shopify store. This is probably the most common case for Shopify store owners.
Usually, the tedious part of maintaining your product pages is filling them with content. Describing your products well will increase your page traffic and conversion rates, that's for sure. This can be achieved by outlining unique selling points, addressing your audience on a personal level, or filling your pages with relevant SEO keywords, to name just a few possibilities.
Writing all that for a single product? No problem.
For hundreds or thousands of products? You might lose months before you build up any organic page traffic.
That's where software tools come into play that use NLG, an AI-based technology for generating content in natural language. We are using AX Semantics' NLG Cloud, an AWS-hosted AI engine that allows us to generate grammatically correct and highly personalized text from structured data, even in multiple languages.
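The exact AX Semantics API calls are beyond the scope of this article, but the pattern is always the same: map the structured product data into a flat payload the NLG engine consumes, send it, and receive generated text back. The payload shape below is a hypothetical sketch, not the real AX Semantics API:

```python
def build_nlg_payload(product: dict) -> dict:
    """Map Shopify product attributes to the flat data structure an NLG
    engine consumes. The payload shape here is an illustrative assumption."""
    return {
        "uid": str(product["id"]),
        "name": product["title"],
        "data": {
            "product_type": product.get("product_type", ""),
            "vendor": product.get("vendor", ""),
            # Shopify stores tags as one comma-separated string.
            "tags": [t.strip() for t in product.get("tags", "").split(",") if t.strip()],
        },
    }
```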
Now, what we need are the following texts:

- a product description for each product page
- meta tags (title and description) for the search results
And optimally, these texts automatically end up in the web store.
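Getting the generated text into the store works through the same Admin API, via a PUT request to the product resource that updates its `body_html`. A minimal sketch, again with a placeholder shop name and access token:

```python
import json
import urllib.request

def build_product_update(product_id: int, body_html: str) -> dict:
    """Payload for PUT /admin/api/2021-04/products/{id}.json."""
    return {"product": {"id": product_id, "body_html": body_html}}

def update_product(shop: str, access_token: str, product_id: int, body_html: str) -> None:
    """Write a generated description back into the Shopify product."""
    url = f"https://{shop}.myshopify.com/admin/api/2021-04/products/{product_id}.json"
    request = urllib.request.Request(
        url,
        data=json.dumps(build_product_update(product_id, body_html)).encode(),
        headers={
            "X-Shopify-Access-Token": access_token,
            "Content-Type": "application/json",
        },
        method="PUT",
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        response.read()
```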
Currently, we have two things at hand: our product data in Shopify, accessible via the Product API, as well as our NLG software to deliver high-quality, automated text results.
That's where data streaming comes into play. We like to use streaming pipelines from our partner DataCater because they are easy to handle and:
Generally speaking, we can flexibly configure regular and irregular data transfers from virtually any source to any sink that offers connectors.
By creating such pipelines that listen to any change in our Shopify products collection, we can ensure that the product texts are always up to date with minimal delay.
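A streaming pipeline such as DataCater's reacts to changes continuously; to make the idea concrete, here is a simplified polling sketch of the change detection, using the `updated_at` timestamp Shopify maintains on every product (the function and its arguments are illustrative, not part of any real pipeline API):

```python
def changed_products(previous: dict, current: list) -> list:
    """Return products whose updated_at timestamp changed since the last run.

    `previous` maps product id -> last seen updated_at; `current` is the
    product list as returned by the Shopify API. Each changed product would
    then be sent to the NLG engine for fresh texts.
    """
    changed = []
    for product in current:
        if previous.get(product["id"]) != product["updated_at"]:
            changed.append(product)
    return changed
```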
If you have read this far, you are probably curious about the outcome. Here is an example screenshot taken from our Shopify page preview.
We included information such as:
With this, you no longer have to manually create or update your product texts whenever you:
And the best part? Setting up a data streaming pipeline with DataCater usually takes no longer than a few minutes. With just a few hundred articles, you will already feel a significant improvement.
Increase your content production speed by orders of magnitude, draw organic visitors to your shop, and increase your sales.