Chunk bytes randomly with min/max

I have a file with 1,000,000 bytes. I want to chunk 200,000 bytes of the file randomly, with a min and max chunk size, while ensuring all 200,000 bytes end up in chunks and no chunk goes under the min chunk size.



Ex:



const min = 20000, max = 50000
const result = [[0,20000], [25000,50000], [72000,110000], ...]


I have tried to wrap my mind around this for the last half hour, but I have had no luck finding anything on the internet.










javascript arrays node.js






asked Nov 25 '18 at 0:26 – Jake Cross




  • You might need to elaborate a bit. So the offset is random, as well as the chunk size? And the problem is that by doing it randomly, you don't know whether or not you'll hit the file end whilst still needing more data? You might also need to elaborate on what sort of random.

    – Matt Way
    Nov 25 '18 at 0:41













  • @MattWay I'm trying to generate random offset points to use for chunking the file bytes. I want to randomly chunk the bytes while preventing an offset from being generated where there is no data.

    – Jake Cross
    Nov 25 '18 at 0:43













  • npmjs.com/package/random-extra

    – bluelovers
    Nov 25 '18 at 1:09



















2 Answers


















So if I understand correctly, you need to retrieve a subset of data, where the subset is broken up into blocks between a min/max size, and you want to select both the offsets and block sizes at random. The trick is that you want to ensure that you don't run out of data before the criteria are met.



Let's start with determining the block sizes themselves. This is actually quite a difficult problem, because of the min chunk size constraint and the need to sum exactly to some value. You can imagine that there are N sets of possible chunk sizes that fit the desired total, where each chunk size can be any value within the constraint range. However, not every set will contain chunk sizes that adhere to the min/max constraint. I have hacked together a working example below, but beware that it brute-forces an acceptable answer by retrying whenever it produces an invalid set.






const randInt = (min, max) => {
  return Math.floor(Math.random() * (max - min + 1) + min)
}

const getSizeArray = (min, max, total) => {
  const output = []
  let leftovers = total
  while(leftovers > max){
    const nextSize = randInt(min, max)
    output.push(nextSize)
    leftovers -= nextSize
  }
  // if the leftovers are less than min, this set is impossible: retry
  if(leftovers < min){ return getSizeArray(min, max, total) }
  // cater to the final amount so the sizes sum to the exact total
  if(leftovers > 0){ output.push(leftovers) }
  return output
}

const sizes = getSizeArray(20000, 50000, 200000)
console.log(sizes)





All we do here is keep choosing random sizes within the range, and take whatever is left over as the final chunk. I'm not exactly sure how this will affect the distribution, though.
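As a quick sanity check (my addition, not part of the original answer), you can confirm that a size array produced by getSizeArray above really sums to the requested total and keeps every chunk within the min/max bounds:

// Sanity check (assumed helper, not in the original answer): verify the generated sizes.
const isValidSizeArray = (arr, min, max, total) =>
  arr.reduce((a, b) => a + b, 0) === total &&
  arr.every(s => s >= min && s <= max)

console.log(isValidSizeArray(getSizeArray(20000, 50000, 200000), 20000, 50000, 200000)) // should log true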





Once you have the chunk sizes, you just need to work out the offsets. You could do this many ways, but the way I did it below tries to enforce fairly uniform gaps between the chunks. It works by splitting the leftover space up by the number of chunks, and giving each chunk a range from which it is free to randomly choose an offset. For example:






const randInt = (min, max) => {
  return Math.floor(Math.random() * (max - min + 1) + min)
}

const getSizeArray = (min, max, total) => {
  const output = []
  let leftovers = total
  while(leftovers > max){
    const nextSize = randInt(min, max)
    output.push(nextSize)
    leftovers -= nextSize
  }
  // if the leftovers are less than min, this set is impossible: retry
  if(leftovers < min){ return getSizeArray(min, max, total) }
  // cater to the final amount so the sizes sum to the exact total
  if(leftovers > 0){ output.push(leftovers) }
  return output
}

const sizes = getSizeArray(20000, 50000, 200000)

const getOffsets = (arr, memSize) => {
  const result = []
  const sum = arr.reduce((r, i) => r + i, 0)
  // average free space available to each chunk
  const gap = (memSize - sum) / arr.length
  arr.forEach((item, i) => {
    // earliest offset this chunk can start at: all previous chunks plus one gap each
    const min = arr.reduce((r, a, index) => {
      if(index < i){
        return r + gap + a
      }
      return r
    }, 0)
    // allow the chunk to start anywhere in the first half of its slot's free space
    const max = min + Math.floor(gap / 2)
    const offset = randInt(min, max)
    result.push([offset, item])
  })
  return result
}

const result = getOffsets(sizes, 1000000)
console.log(result)
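Note that getOffsets returns [offset, size] pairs. If you want [start, end] byte ranges like the question's example output (an assumption about the desired format on my part), a small conversion step maps one to the other:

// Assumed conversion from [offset, size] pairs to [start, end] byte ranges.
const toRanges = pairs => pairs.map(([offset, size]) => [offset, offset + size])

console.log(toRanges(getOffsets(getSizeArray(20000, 50000, 200000), 1000000)))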








answered Nov 25 '18 at 5:12 – Matt Way

Just generate random values between 25000 and 50000 that sum up to 200000:



const range = (min, max) => min + Math.floor(Math.random() * (max - min));

function sizes(size, min, max) {
  const sizes = [];
  let pos;
  for(pos = 0; pos < size - max;) {
    const curr = pos + range(min, Math.min(max, size - max));
    sizes.push(curr);
    pos += curr;
  }
  sizes.push(size - pos);
  return sizes;
}


Now generate the chunk sizes and the sizes of the parts between the chunks, then map them to indices:



const toChunk = range(1000000 - 200000, 1000000),
      leftOver = 1000000 - toChunk,
      chunkSizes = sizes(toChunk, 25000, 50000),
      spaceSizes = sizes(leftOver, 0, range(10, leftOver)),
      chunks = [];

while(spaceSizes.length > chunkSizes.length + 1)
  spaceSizes.splice(range(0, spaceSizes.length), spaceSizes.pop() + spaceSizes.pop());

let start = 0;
for(const chunkSize of chunkSizes) {
  chunks.push([start, start + chunkSize - 1]);
  start += chunkSize;
  start += spaceSizes.pop() || 0;
}
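To inspect what the snippet above produces, a small check (my addition, not part of the original answer) prints the chunks and how many bytes they cover in total, using the inclusive [start, end] ranges built above:

// Assumed inspection helper: total bytes covered by the generated chunks.
const covered = chunks.reduce((sum, [start, end]) => sum + (end - start + 1), 0);
console.log(chunks);
console.log('bytes covered:', covered);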





edited Nov 25 '18 at 10:56, answered Nov 25 '18 at 10:21 – Jonas Wilms