The purpose of StringToIntegerArrayNormalization is to create a data structure, a ConsistantHashCodeChain#1.3.6.1.4.1.33097.1.0.4.3, that maps to a series of nested / bounded Ziggurats#1.3.6.1.4.1.33097.1.0.50 which together make up a single index.
There are various approaches to integer array normalization, depending on the encoding of the characters in the strings (e.g. ASCII, UTF-8, UTF-16, etc.). Regardless of the approach, the first step is to get the strings into the desired encoding so that the binary encoding can be used as the integer representation. The integers in the array are always non-negative (zero or greater) unsigned integers.
Next, the bit size of the integers must be considered, because the binary data from the strings will need to be padded with binary zeros in the rightmost integer in the array.
Finally, it is generally recommended to apply String#1.3.6.1.4.1.33097.5.1 or WordNormalization#1.3.6.1.4.1.33097.5.1 before StringToIntegerArrayNormalization.
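As a rough illustration of the encoding and padding considerations above, the following Python sketch shows how the raw bytes of a string supply the bits for the integer array and how many padding bits the rightmost integer would need. UTF-8 and a 32-bit integer size are assumptions chosen for the example, not requirements of the specification:

```python
# Illustrative sketch only; the encoding (UTF-8) and bit size (32) are assumptions.
text = "cat"
raw = text.encode("utf-8")            # b'cat' -> 3 octets of binary data
o = len(raw)                          # o = 3 bytes, i.e. 24 bits of string data
b = 32                                # chosen fixed bit size of each integer

padding_bits = (b - (o * 8) % b) % b  # (32 - 24) % 32 = 8 zero bits appended
print(o, padding_bits)                # -> 3 8
```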
a : a represents the number of slots in the Array.
b : b represents the number of bits in the integers in the array. - Note: this could vary by Array slot, however to keep things simple in the example we have used fixed-size integers.
o : o represents the total length of the string in bytes (o for octets, a synonym for bytes).
The number of slots can be computed as a = ceiling((o × 8) / b); when o × 8 is not an exact multiple of b this works out to a = 1 + floor((o × 8) / b).
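As a quick worked example (o = 10 and b = 32 are arbitrary values chosen for illustration), the slot count works out as follows:

```python
import math

o = 10                      # assumed total string length in octets
b = 32                      # assumed bits per integer slot
a = math.ceil((o * 8) / b)  # ceil(80 / 32) = 3 slots; the last slot is only partly filled
print(a)                    # -> 3
```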
For all elements except the last element, simply convert the bytes into integers and store them in the array.
On the right side of the remaining string bytes, append binary zero bits until there are enough bits to fill the size b.
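Putting the steps together, here is a minimal Python sketch of the whole conversion. The function name, the default UTF-8 encoding, and the 32-bit slot size are illustrative assumptions rather than anything mandated by StringToIntegerArrayNormalization:

```python
import math

def string_to_integer_array(text: str, b: int = 32, encoding: str = "utf-8") -> list[int]:
    """Sketch: pack the encoded bytes of `text` into b-bit unsigned integers,
    padding the rightmost slot with binary zeros on its right side."""
    raw = text.encode(encoding)
    total_bits = len(raw) * 8
    a = math.ceil(total_bits / b)            # number of slots in the array

    # Treat the encoded bytes as one big-endian bit string ...
    bits = int.from_bytes(raw, byteorder="big")
    # ... and left-align it so the unused bits of the last slot become trailing zeros.
    bits <<= a * b - total_bits

    # Slice the bit string into a slots of b bits each, left to right.
    mask = (1 << b) - 1
    return [(bits >> (b * (a - 1 - i))) & mask for i in range(a)]

# Example (assumed values): "cat" is 3 octets, so one 32-bit slot with 8 padding bits.
# string_to_integer_array("cat") -> [1667331072]   # 0x63617400
```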