Scala Source Code: How to understand zipWithIndex
I want to implement my own function minWithIndex, so I looked into the zipWithIndex function from the Scala standard library, but I am having trouble understanding it.

I know what the function does, but I cannot figure out how it does it.
Thanks in advance.



def zipWithIndex[A1 >: A, That](implicit bf: CanBuildFrom[Repr, (A1, Int), That]): That = {
  val b = bf(repr)
  var i = 0
  for (x <- this) {
    b += ((x, i))
    i += 1
  }
  b.result()
}









scala function typeclass
asked Nov 22 '18 at 22:25









Ankush Singh

"I know what the function does, but cannot figure out how it does it"... welcome to the Scala standard libs :)

– Lasf
Nov 23 '18 at 0:53


2 Answers














Your question is actually two different questions:




  1. How are the Scala standard library collections designed and, in particular, how does zipWithIndex work?

  2. How do I implement my own minWithIndex?



Unfortunately the first question is probably more complicated than the second one.



Scala collections design



As of version 2.12, the current scala.collection is designed to be very flexible, but that flexibility comes at the cost of many advanced features and design patterns. One might also argue that it is over-engineered for real-world usage and thus hard for a Scala novice to understand.



Designing a collections library is a notoriously hard problem because ideally you want to capture many different aspects. The main tension is that you want to capture as many aspects as possible while having as little code duplication as possible. (For example, consider a linked list vs. an array-based list. Both should probably have a method like indexOf that can be implemented in terms of size and getByIndex, so ideally you'd like to have only one implementation of it.) This is a real design challenge.
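
Here is a minimal sketch of that idea: a single shared indexOf written against two abstract operations. The trait and method names (MySeq, getByIndex) are illustrative only, not the real library API.

trait MySeq[A] {
  def size: Int
  def getByIndex(i: Int): A

  // One shared implementation for every concrete MySeq, whether it is
  // backed by a linked list, an array, or anything else.
  def indexOf(elem: A): Int = {
    var i = 0
    while (i < size) {
      if (getByIndex(i) == elem) return i
      i += 1
    }
    -1
  }
}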



Here is a non-exhaustive list of aspects that affected Scala's collections design (you may also find some design notes on the earlier (2.8) version here):




  • Unified interfaces for different data structures implementing similar things (for example, a linked list and an array-based list are very similar, while a linked list and a tree-based set are much less similar but still have something in common). This is the reason for the various traits such as Seq or GenTraversableLike and the deep inheritance hierarchy.


  • Type-safety. We want a list of integers to be a different type from a list of strings. And we'd like to know the type of the stored elements. This is why all the collections are generic.


  • Performance. In every language the standard collections are among the most heavily used pieces of code, so good performance is very important. This is the reason why in several places there is an "impure" mutable implementation behind a pure, immutable FP interface.


  • Separate read-only and mutable collections (especially important in FP languages, but in others as well). This is why there are the scala.collection, scala.collection.mutable and scala.collection.immutable packages.


  • Support for potentially unlimited sequences, either generated (Fibonacci numbers) or coming from an external source (such as a sequence of bytes read from console input). This is the reason for things like TraversableOnce or Iterable.


  • Support for sequences that can't be (easily) processed twice. For example, a stream of all the deals on some major stock exchange is so huge that it can't be stored and re-processed. This is the reason for TraversableOnce vs Traversable.


  • Internal vs. external iteration (this is the reason for the separation between Iterable and Traversable; see the short sketch after this list).


  • Eager vs. lazy computations (this is the reason for "View"-suffixed types)


  • Support for collection-like objects that are not collections. For example, imagine you are implementing an HTML parser or even a browser. Many HTML elements can have children, but being a collection of children is not their main responsibility. This is the reason for the various traits with the Gen prefix.
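
To illustrate the internal vs. external iteration bullet above, here is a short sketch using only standard library types (Scala 2.12):

// Internal iteration: the collection drives the loop, you only supply a callback.
val t: Traversable[Int] = List(1, 2, 3)
t.foreach(println)

// External iteration: you obtain an iterator and drive the loop yourself.
val it: Iterator[Int] = List(1, 2, 3).iterator
while (it.hasNext) println(it.next())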



Now let's get back to zipWithIndex. This is one of many methods (map, filter and many others are similar) that pose an additional challenge for the designers: they produce a new collection from the current one. On the one hand, such methods can be implemented generically, once for all collections; on the other hand, with a naive implementation we would lose the specific types and, even worse, we might be forced to change the implementation and thus probably the semantics. Consider this naive implementation of the filter method:



trait Iterable[A] {
  def filter(p: A => Boolean): ? = {
    var res: List[A] = Nil
    for (x <- this)
      if (p(x)) res = x :: res
    res
  }
}


What return type should we put in place of the ? placeholder? If we put just Iterable[A], it means we immediately lose the ability to use all the methods of List that are not available on Iterable (such as access by index). There is another issue as well: what if our Iterable was actually a Set? We probably want the result of filter to be a Set again rather than a List. This is what the article referenced above calls the "same-result-type principle". Scala is one of the few languages that lets you design an elegant solution to this problem with minimal code duplication. The main idea is to have a companion object for each implementation (for example Iterable and immutable.List) and also a companion trait for each intermediate trait (for example SeqFactory). One of the crucial things those companions provide is the ability to create a new instance of the "same" collection (possibly with a different generic type).
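
A quick REPL sketch of the same-result-type principle in action (hypothetical session, Scala 2.12):

scala> List(1, 2, 3, 2).filter(_ > 1)
res0: List[Int] = List(2, 3, 2)

scala> Set(1, 2, 3).filter(_ > 1)
res1: scala.collection.immutable.Set[Int] = Set(2, 3)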



And the last thing that is not covered in the referenced article is the CanBuildFrom implicit parameter. The logic behind it is as follows: assume you have a linked list (List) and want an array-based list (Vector) zipped with its indices. Do you have to create an intermediate linked list zipped with indices in the process? CanBuildFrom is the trick that allows the answer to be: no, you don't have to. Implicit parameters are a fairly advanced feature of Scala and you should probably read more about them if you are not already familiar. The idea is that the compiler can automatically search for a parameter of the matching type, so if there is a value of the matching type in scope, the code will compile. It means that the presence of a CanBuildFrom can be used as evidence that you can change the collection's underlying data structure at the same time as you do some data manipulation (zipWithIndex, map, filter, etc.). Each companion object provides a default CanBuildFrom value with the same target data structure (for example, see immutable.List.canBuildFrom).
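
For example, here is a sketch of the List-to-Vector case mentioned above (Scala 2.12; collection.breakOut is the stock helper that summons a CanBuildFrom for a different target collection):

import scala.collection.breakOut

val xs = List("a", "b", "c")

// The expected type Vector[(String, Int)] plus breakOut selects a CanBuildFrom
// that builds a Vector directly, without an intermediate List[(String, Int)].
val v: Vector[(String, Int)] = xs.zipWithIndex(breakOut)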



So now we can see how IterableLike.zipWithIndex, which is the default implementation for all subtypes, is implemented:



def zipWithIndex[A1 >: A, That](implicit bf: CanBuildFrom[Repr, (A1, Int), That]): That = {
  val b = bf(repr)
  var i = 0
  for (x <- this) {
    b += ((x, i))
    i += 1
  }
  b.result()
}


The signature says that, using zipWithIndex, you can convert your collection Repr of A into That, a collection of tuples (A1, Int), as long as:





  1. A1 is a supertype of A (which is a safe upcast)

  2. You have evidence that you can build That from Repr (the bf object).


What the method does is ask the bf evidence to create b, a new Builder[(A1, Int), That], then fill the builder with tuples, and finally return the That collection from the builder (b.result()). Both the builder and the for loop with var i are used for performance reasons.
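
To make the mechanics concrete, here is a sketch that resolves the implicit by hand and drives the builder manually (Scala 2.12; normally the compiler supplies bf for you):

import scala.collection.generic.CanBuildFrom

val bf: CanBuildFrom[List[_], (String, Int), List[(String, Int)]] = List.canBuildFrom[(String, Int)]
val b = bf(List("a", "b"))     // a Builder[(String, Int), List[(String, Int)]]
b += (("a", 0))
b += (("b", 1))
val zipped: List[(String, Int)] = b.result()   // List((a,0), (b,1))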



How to implement your own minWithIndex



The answer depends on how generic you want it to be and how important performance is to you. Here are some options:



implicit class MinWithIndexOps[A, Repr <: IterableLike[A, Repr]](it: IterableLike[A, Repr]) {
  def minWithIndexWithZip[B >: A](implicit cmp: Ordering[B], bf: CanBuildFrom[Repr, (A, Int), Iterable[(A, Int)]]): (A, Int) =
    it.zipWithIndex(bf).min(Ordering.by[(A, Int), B]((kv: (A, Int)) => kv._1))

  def minWithIndexWithFold[B >: A](implicit cmp: Ordering[B]): (A, Int) = {
    if (it.isEmpty)
      throw new UnsupportedOperationException("empty.min")

    val h = it.head
    val res = it.foldLeft((h, 0, 0))((acc, cur) => acc match {
      case (minVal, minIndex, curIndex) =>
        if (cmp.lteq(minVal, cur)) (minVal, minIndex, curIndex + 1) else (cur, curIndex, curIndex + 1)
    })
    (res._1, res._2)
  }

  def minWithIndexWithVars[B >: A](implicit cmp: Ordering[B]): (A, Int) = {
    if (it.isEmpty)
      throw new UnsupportedOperationException("empty.min")

    var minVal = it.head
    var minIndex = 0
    var i = 0
    for (cur <- it) {
      if (cmp.gt(minVal, cur)) {
        minVal = cur
        minIndex = i
      }
      i += 1
    }
    (minVal, minIndex)
  }
}


def test(): Unit = {
  val data = "qwerty" :: "asdfg" :: "zxcvb" :: Nil

  println(data.minWithIndexWithZip)
  println(data.minWithIndexWithFold)
  println(data.minWithIndexWithVars)
}


Note that here the implicit keyword is used with a different meaning: to create an implicit class that effectively extends every instance of IterableLike with new methods.
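
For instance, a tiny REPL sketch of the implicit-class mechanism on its own (an illustrative IntOps, unrelated to the code above):

scala> implicit class IntOps(val n: Int) { def squared: Int = n * n }
defined class IntOps

scala> 3.squared
res0: Int = 9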



The first implementation, minWithIndexWithZip, is very straightforward but probably very inefficient: you literally do zipWithIndex first and then call min (note also that the standard min uses the, for Scala, unusual convention of throwing an exception instead of returning an Option[A], and I kept the same semantics in the other implementations). This method is inefficient because you have to create a whole new collection only for it to be thrown away almost immediately.



minWithIndexWithFold is a different implementation that does not rely on zipWithIndex and uses foldLeft instead. fold is at the same time a very basic and a very generic operation. One benefit of this code is that it is also immutable. But because of that its performance is not great either: a lot of intermediate accumulator objects are created and discarded almost immediately.



The last one, minWithIndexWithVars, is probably the least "pure" but the most performant version; it uses straightforward imperative-style code.






answered Nov 23 '18 at 3:17 – SergGr


Scala sometimes uses mutable solutions internally, and its implementation of zipWithIndex is one example.



It basically works by:




  • iterating over the collection (the for loop),

  • creating a tuple (a, Int) for each element, and

  • appending it to a mutable builder (b += ((x, i))).


So, for the input List("first", "second"), you get List(("first", 0), ("second", 1)) after zipWithIndex.



scala> List("first", "second").zipWithIndex
res3: List[(String, Int)] = List((first,0), (second,1))


You can achieve the same using a recursive approach with immutability:



object MyZippedCollection {

  def main(args: Array[String]): Unit = {

    def zipWithIndex[A](list: List[A]): List[(A, Int)] = {

      def zipWith(index: Int, acc: List[(A, Int)], list: List[A]): List[(A, Int)] = {
        if (list.isEmpty) acc
        else zipWith(index + 1, acc :+ ((list.head, index)), list.tail)
      }

      zipWith(0, List.empty[(A, Int)], list)
    }

    val myList = List("scala", "fp", "haskell")

    zipWithIndex(myList).foreach { case (value, index) =>
      println(s"[$index]: $value")
    }
  }
}
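
A possible refinement sketch (zipWithIndexLinear is just an illustrative name): on List, appending with :+ is O(n), so the accumulator version above is quadratic overall. Prepending with :: and reversing once at the end keeps it linear, and the helper can be marked @annotation.tailrec:

def zipWithIndexLinear[A](list: List[A]): List[(A, Int)] = {
  @annotation.tailrec
  def loop(index: Int, acc: List[(A, Int)], rest: List[A]): List[(A, Int)] = rest match {
    case Nil          => acc.reverse
    case head :: tail => loop(index + 1, (head, index) :: acc, tail)
  }

  loop(0, Nil, list)
}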





    share|improve this answer

























      Your Answer






      StackExchange.ifUsing("editor", function () {
      StackExchange.using("externalEditor", function () {
      StackExchange.using("snippets", function () {
      StackExchange.snippets.init();
      });
      });
      }, "code-snippets");

      StackExchange.ready(function() {
      var channelOptions = {
      tags: "".split(" "),
      id: "1"
      };
      initTagRenderer("".split(" "), "".split(" "), channelOptions);

      StackExchange.using("externalEditor", function() {
      // Have to fire editor after snippets, if snippets enabled
      if (StackExchange.settings.snippets.snippetsEnabled) {
      StackExchange.using("snippets", function() {
      createEditor();
      });
      }
      else {
      createEditor();
      }
      });

      function createEditor() {
      StackExchange.prepareEditor({
      heartbeatType: 'answer',
      autoActivateHeartbeat: false,
      convertImagesToLinks: true,
      noModals: true,
      showLowRepImageUploadWarning: true,
      reputationToPostImages: 10,
      bindNavPrevention: true,
      postfix: "",
      imageUploader: {
      brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
      contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
      allowUrls: true
      },
      onDemand: true,
      discardSelector: ".discard-answer"
      ,immediatelyShowMarkdownHelp:true
      });


      }
      });














      draft saved

      draft discarded


















      StackExchange.ready(
      function () {
      StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fstackoverflow.com%2fquestions%2f53438631%2fscala-source-code-how-to-understand-zipwithindex%23new-answer', 'question_page');
      }
      );

      Post as a guest















      Required, but never shown

























      2 Answers
      2






      active

      oldest

      votes








      2 Answers
      2






      active

      oldest

      votes









      active

      oldest

      votes






      active

      oldest

      votes









      1














      Your question looks like actually two different questions:




      1. How Scala standard library collections are designed and particularly how zipWithIndex works?


      2. How to implement my custom minWithIndex?



      Unfortunately the first question is probably more complicated than the second one.



      Scala collections design



      As of version 2.12 current scala.collection is designed to be very flexible but it comes with a cost of using many advanced features and/or design patterns. Also one might argue that it is over-engineered for the real-world usages and thus it is hard to understand for a Scala novice.



      Designing collections library is a known hard problem because ideally you want to capture many different aspects. The main things is that you probably want to capture as many aspects as possible but at the same time has as little code duplication as possible. (For example consider linked-list vs array-based list. Both probably should have method like indexOf that can be implemented in terms of size and getByIndex so ideally you'd like to have only one implementation here.) This is a real design challenge.



      Here is a non-exhaustive list of aspects that affected Scala's collections design (you may also find some design notes on the earlier (2.8) version here):




      • Unified interfaces for different data structures implementing similar things (for example linked-list vs. array-based list are very similar while linked-list vs tree-based set are much less similar but still have something common). This is the reason for various traits such as Seq or GenTraversableLike and deep inheritance hierarchy.


      • Type-safety. We want a list of integers to be a different type from a list of strings. And we'd like to know the type of the stored elements. This is why all the collections are generic.


      • Performance. In every language standard collections are one of the most used pieces of code so good performance is very important. This is the reason why in several places there is an "impure" mutable implementation for a pure immutable FP interfaces.


      • Separate read-only and mutable collections (especially important in the FP-languages but in others as well). This is why there are scala.collection, scala.collection.mutable and scala.collection.immutable packages.


      • Support for potentially unlimited sequences such as generated (Fibonacci numbers) or coming from an external source (such as sequence of bytes read from console input). This is the reason for things like TraversableOnce or Iterable.


      • Support for sequences that can't be (easily) processed twice. For example a stream of all the deals on some major stock exchange is so huge that it can't be stored and re-processed. This is the reason for TraversableOnce vs Traversable


      • Internal vs external iteration (this is the reason of separation between Iterable vs Traversable)


      • Eager vs. lazy computations (this is the reason for "View"-suffixed types)


      • Support of the collection-like objects that are not collections. For example, you are implementing HTML-parser or even browser. Many HTML-elements can have children but being a collection of children is not the main responsibility. This is the reason for various traits with 'Gen` prefix.



      Now let's get back to zipWithIndex. This is one of many methods (other similar ones are map, fitler and many others) that pose additional challenges for the designers: those methods produce new collection from the current one. On the one hand such methods can be implemented generically one time for all collections, on the other hand with naive implementation we will lose specific types and even worse we might be forced to change implementation and thus probably semantics. Consider this naive implementation of filter method:



      trait Iterable[A] {
      def filter(p: A => Boolean): ? = {
      var res: List[A] = Nil
      for (x <- this)
      if (p(x)) res = x :: res
      res
      }
      }


      What return type to put instead of the ?? If we put just Iterable[A] it means that we immediately lost the ability to use all the methods of the List that are not available in the Iterable (such as access by index). But another thing is what if our Iterable actually was a Set. We probably want the result of the filter to be a Set again instead of a List. This is the thing that the reference above article calls the "same-result-type principle". Scala is one of the few languages that allows you to design an elegant solution for this problem with minimal code duplication. The main idea is to have a companion object for each implementation (for example Iterable and immutable.List) and also a companion trait for each intermediate trait (for example SeqFactory. One of the crucial things those companions provide is an ability to create a new instance of the "same" collection (probably with a different generic type).



      And the last thing that is not covered in the referenced article is the CanBuildFrom implicit parameter. The logic behind it is following: assume you have a linked-list (List) and want to have an array-based list (Vector) zipped with index. Do you have to create an intermediate linked-list zipped with index in the process? CanBuildFrom is the trick that allows the answer to be: No, you don't have to. Implicit parameters is a fairly advanced feature of Scala and you probably should read more on it if you are not already familiar. The idea is that the compiler can automatically search for a parameter of matching type. So if there is any value matching the type is the scope, the code will compile. It means that the presence of a CanBuildFrom can be used as an evidence that you can change collection's underlying data structure at the same time as you do some data manipulation (zipWithIndex, map, filter, etc.). Each companion object provides default CanBuildFrom value with the same target data structure (for example see immutable.List.canBuildFrom).



      So now we can see how IterableLike.zipWithIndex, which is the default implementation for all subtypes, is implemented



        def zipWithIndex[A1 >: A, That](implicit bf: CanBuildFrom[Repr, (A1, Int), That]): That = {
      val b = bf(repr)
      var i = 0
      for (x <- this) {
      b += ((x, i))
      i += 1
      }
      b.result()
      }


      The signature says that using zipWithIndex you can convert your collection Repr of A into That - a collection of tuples (A1, Int) as long as





      1. A1 is a supertype of A (which is a safe upcast)

      2. You have an evidence that you can build That from Repr (the bf object)


      And what the method does is it asks the bf evidence to create b - a new Builder[(A1, Int), That], then fill the builder with tuples, and finally return the That collection from the builder (b.result()). Both builder and for with var i are used for performance reasons.



      How to implement your own minWithIndex



      The answer depends on how generic you want it to be and how much performance is important for you. Here are some options:



      implicit class MinWithIndexOps[A, Repr <: IterableLike[A, Repr]](it: IterableLike[A, Repr]) {
      def minWithIndexWithZip[B >: A](implicit cmp: Ordering[B], bf: CanBuildFrom[Repr, (A, Int), Iterable[(A, Int)]]): (A, Int) = it.zipWithIndex(bf).min(Ordering.by[(A, Int), B]((kv: (A, Int)) => kv._1))

      def minWithIndexWithFold[B >: A](implicit cmp: Ordering[B]): (A, Int) = {
      if (it.isEmpty)
      throw new UnsupportedOperationException("empty.min")

      val h = it.head
      val res = it.foldLeft((h, 0, 0))((acc, cur) => acc match {
      case (minVal, minIndex, curIndex) => if (cmp.lteq(minVal, cur)) (minVal, minIndex, curIndex + 1) else (cur, curIndex, curIndex + 1)
      })
      (res._1, res._2)
      }

      def minWithIndexWithVars[B >: A](implicit cmp: Ordering[B]): (A, Int) = {
      if (it.isEmpty)
      throw new UnsupportedOperationException("empty.min")

      var minVal = it.head
      var minIndex = 0
      var i = 0
      for (cur <- it) {
      if (cmp.gt(minVal, cur)) {
      minVal = cur
      minIndex = i
      }
      i += 1
      }
      (minVal, minIndex)
      }
      }


      def test(): Unit = {
      val data = "qwerty" :: "asdfg" :: "zxcvb" :: Nil

      println(data.minWithIndexWithZip)
      println(data.minWithIndexWithFold)
      println(data.minWithIndexWithVars)
      }


      Note that here implicit keyword is used in a different meaning to create an implicit class that effectively extends every instance of IterableLike with new methods.



      First implementation minWithIndexWithZip is very straightforward but probably very inefficient: you literally first do zipWithIndex and then call min (note also that the standard min uses an unusual for Scala convention of throwing exception instead of returning Option[A] and I used the same semantics in other implementations). This method is inefficient because you have to create whole new collection just to be disposed almost immediately.



      minWithIndexWithFold is a different implementation that does not rely on zipWithIndex and uses foldLeft instead. fold is at the same time a very basic and very generic operation. One benefit of this code is that it is also immutable. But because of that its performance is also not good: a lot of intermediate accumulator objects will be created and disposed almost immediately.



      The last minWithIndexWithVars is probably the least "pure" but the most performant version that uses straightforward imperative-like code.






      share|improve this answer




























        1














        Your question looks like actually two different questions:




        1. How Scala standard library collections are designed and particularly how zipWithIndex works?


        2. How to implement my custom minWithIndex?



        Unfortunately the first question is probably more complicated than the second one.



        Scala collections design



        As of version 2.12 current scala.collection is designed to be very flexible but it comes with a cost of using many advanced features and/or design patterns. Also one might argue that it is over-engineered for the real-world usages and thus it is hard to understand for a Scala novice.



        Designing collections library is a known hard problem because ideally you want to capture many different aspects. The main things is that you probably want to capture as many aspects as possible but at the same time has as little code duplication as possible. (For example consider linked-list vs array-based list. Both probably should have method like indexOf that can be implemented in terms of size and getByIndex so ideally you'd like to have only one implementation here.) This is a real design challenge.



        Here is a non-exhaustive list of aspects that affected Scala's collections design (you may also find some design notes on the earlier (2.8) version here):




        • Unified interfaces for different data structures implementing similar things (for example linked-list vs. array-based list are very similar while linked-list vs tree-based set are much less similar but still have something common). This is the reason for various traits such as Seq or GenTraversableLike and deep inheritance hierarchy.


        • Type-safety. We want a list of integers to be a different type from a list of strings. And we'd like to know the type of the stored elements. This is why all the collections are generic.


        • Performance. In every language standard collections are one of the most used pieces of code so good performance is very important. This is the reason why in several places there is an "impure" mutable implementation for a pure immutable FP interfaces.


        • Separate read-only and mutable collections (especially important in the FP-languages but in others as well). This is why there are scala.collection, scala.collection.mutable and scala.collection.immutable packages.


        • Support for potentially unlimited sequences such as generated (Fibonacci numbers) or coming from an external source (such as sequence of bytes read from console input). This is the reason for things like TraversableOnce or Iterable.


        • Support for sequences that can't be (easily) processed twice. For example a stream of all the deals on some major stock exchange is so huge that it can't be stored and re-processed. This is the reason for TraversableOnce vs Traversable


        • Internal vs external iteration (this is the reason of separation between Iterable vs Traversable)


        • Eager vs. lazy computations (this is the reason for "View"-suffixed types)


        • Support of the collection-like objects that are not collections. For example, you are implementing HTML-parser or even browser. Many HTML-elements can have children but being a collection of children is not the main responsibility. This is the reason for various traits with 'Gen` prefix.



        Now let's get back to zipWithIndex. This is one of many methods (other similar ones are map, fitler and many others) that pose additional challenges for the designers: those methods produce new collection from the current one. On the one hand such methods can be implemented generically one time for all collections, on the other hand with naive implementation we will lose specific types and even worse we might be forced to change implementation and thus probably semantics. Consider this naive implementation of filter method:



        trait Iterable[A] {
        def filter(p: A => Boolean): ? = {
        var res: List[A] = Nil
        for (x <- this)
        if (p(x)) res = x :: res
        res
        }
        }


        What return type to put instead of the ?? If we put just Iterable[A] it means that we immediately lost the ability to use all the methods of the List that are not available in the Iterable (such as access by index). But another thing is what if our Iterable actually was a Set. We probably want the result of the filter to be a Set again instead of a List. This is the thing that the reference above article calls the "same-result-type principle". Scala is one of the few languages that allows you to design an elegant solution for this problem with minimal code duplication. The main idea is to have a companion object for each implementation (for example Iterable and immutable.List) and also a companion trait for each intermediate trait (for example SeqFactory. One of the crucial things those companions provide is an ability to create a new instance of the "same" collection (probably with a different generic type).



        And the last thing that is not covered in the referenced article is the CanBuildFrom implicit parameter. The logic behind it is following: assume you have a linked-list (List) and want to have an array-based list (Vector) zipped with index. Do you have to create an intermediate linked-list zipped with index in the process? CanBuildFrom is the trick that allows the answer to be: No, you don't have to. Implicit parameters is a fairly advanced feature of Scala and you probably should read more on it if you are not already familiar. The idea is that the compiler can automatically search for a parameter of matching type. So if there is any value matching the type is the scope, the code will compile. It means that the presence of a CanBuildFrom can be used as an evidence that you can change collection's underlying data structure at the same time as you do some data manipulation (zipWithIndex, map, filter, etc.). Each companion object provides default CanBuildFrom value with the same target data structure (for example see immutable.List.canBuildFrom).



        So now we can see how IterableLike.zipWithIndex, which is the default implementation for all subtypes, is implemented



          def zipWithIndex[A1 >: A, That](implicit bf: CanBuildFrom[Repr, (A1, Int), That]): That = {
        val b = bf(repr)
        var i = 0
        for (x <- this) {
        b += ((x, i))
        i += 1
        }
        b.result()
        }


        The signature says that using zipWithIndex you can convert your collection Repr of A into That - a collection of tuples (A1, Int) as long as





        1. A1 is a supertype of A (which is a safe upcast)

        2. You have an evidence that you can build That from Repr (the bf object)


        And what the method does is it asks the bf evidence to create b - a new Builder[(A1, Int), That], then fill the builder with tuples, and finally return the That collection from the builder (b.result()). Both builder and for with var i are used for performance reasons.



        How to implement your own minWithIndex



        The answer depends on how generic you want it to be and how much performance is important for you. Here are some options:



        implicit class MinWithIndexOps[A, Repr <: IterableLike[A, Repr]](it: IterableLike[A, Repr]) {
        def minWithIndexWithZip[B >: A](implicit cmp: Ordering[B], bf: CanBuildFrom[Repr, (A, Int), Iterable[(A, Int)]]): (A, Int) = it.zipWithIndex(bf).min(Ordering.by[(A, Int), B]((kv: (A, Int)) => kv._1))

        def minWithIndexWithFold[B >: A](implicit cmp: Ordering[B]): (A, Int) = {
        if (it.isEmpty)
        throw new UnsupportedOperationException("empty.min")

        val h = it.head
        val res = it.foldLeft((h, 0, 0))((acc, cur) => acc match {
        case (minVal, minIndex, curIndex) => if (cmp.lteq(minVal, cur)) (minVal, minIndex, curIndex + 1) else (cur, curIndex, curIndex + 1)
        })
        (res._1, res._2)
        }

        def minWithIndexWithVars[B >: A](implicit cmp: Ordering[B]): (A, Int) = {
        if (it.isEmpty)
        throw new UnsupportedOperationException("empty.min")

        var minVal = it.head
        var minIndex = 0
        var i = 0
        for (cur <- it) {
        if (cmp.gt(minVal, cur)) {
        minVal = cur
        minIndex = i
        }
        i += 1
        }
        (minVal, minIndex)
        }
        }


        def test(): Unit = {
        val data = "qwerty" :: "asdfg" :: "zxcvb" :: Nil

        println(data.minWithIndexWithZip)
        println(data.minWithIndexWithFold)
        println(data.minWithIndexWithVars)
        }


        Note that here implicit keyword is used in a different meaning to create an implicit class that effectively extends every instance of IterableLike with new methods.



        First implementation minWithIndexWithZip is very straightforward but probably very inefficient: you literally first do zipWithIndex and then call min (note also that the standard min uses an unusual for Scala convention of throwing exception instead of returning Option[A] and I used the same semantics in other implementations). This method is inefficient because you have to create whole new collection just to be disposed almost immediately.



        minWithIndexWithFold is a different implementation that does not rely on zipWithIndex and uses foldLeft instead. fold is at the same time a very basic and very generic operation. One benefit of this code is that it is also immutable. But because of that its performance is also not good: a lot of intermediate accumulator objects will be created and disposed almost immediately.



        The last minWithIndexWithVars is probably the least "pure" but the most performant version that uses straightforward imperative-like code.






        share|improve this answer


























          1












          1








          1







          Your question looks like actually two different questions:




          1. How Scala standard library collections are designed and particularly how zipWithIndex works?


          2. How to implement my custom minWithIndex?



          Unfortunately the first question is probably more complicated than the second one.



          Scala collections design



          As of version 2.12 current scala.collection is designed to be very flexible but it comes with a cost of using many advanced features and/or design patterns. Also one might argue that it is over-engineered for the real-world usages and thus it is hard to understand for a Scala novice.



          Designing collections library is a known hard problem because ideally you want to capture many different aspects. The main things is that you probably want to capture as many aspects as possible but at the same time has as little code duplication as possible. (For example consider linked-list vs array-based list. Both probably should have method like indexOf that can be implemented in terms of size and getByIndex so ideally you'd like to have only one implementation here.) This is a real design challenge.



          Here is a non-exhaustive list of aspects that affected Scala's collections design (you may also find some design notes on the earlier (2.8) version here):




          • Unified interfaces for different data structures implementing similar things (for example linked-list vs. array-based list are very similar while linked-list vs tree-based set are much less similar but still have something common). This is the reason for various traits such as Seq or GenTraversableLike and deep inheritance hierarchy.


          • Type-safety. We want a list of integers to be a different type from a list of strings. And we'd like to know the type of the stored elements. This is why all the collections are generic.


          • Performance. In every language standard collections are one of the most used pieces of code so good performance is very important. This is the reason why in several places there is an "impure" mutable implementation for a pure immutable FP interfaces.


          • Separate read-only and mutable collections (especially important in the FP-languages but in others as well). This is why there are scala.collection, scala.collection.mutable and scala.collection.immutable packages.


          • Support for potentially unlimited sequences such as generated (Fibonacci numbers) or coming from an external source (such as sequence of bytes read from console input). This is the reason for things like TraversableOnce or Iterable.


          • Support for sequences that can't be (easily) processed twice. For example a stream of all the deals on some major stock exchange is so huge that it can't be stored and re-processed. This is the reason for TraversableOnce vs Traversable


          • Internal vs external iteration (this is the reason of separation between Iterable vs Traversable)


          • Eager vs. lazy computations (this is the reason for "View"-suffixed types)


          • Support of the collection-like objects that are not collections. For example, you are implementing HTML-parser or even browser. Many HTML-elements can have children but being a collection of children is not the main responsibility. This is the reason for various traits with 'Gen` prefix.



          Now let's get back to zipWithIndex. This is one of many methods (other similar ones are map, fitler and many others) that pose additional challenges for the designers: those methods produce new collection from the current one. On the one hand such methods can be implemented generically one time for all collections, on the other hand with naive implementation we will lose specific types and even worse we might be forced to change implementation and thus probably semantics. Consider this naive implementation of filter method:



          trait Iterable[A] {
          def filter(p: A => Boolean): ? = {
          var res: List[A] = Nil
          for (x <- this)
          if (p(x)) res = x :: res
          res
          }
          }


          What return type to put instead of the ?? If we put just Iterable[A] it means that we immediately lost the ability to use all the methods of the List that are not available in the Iterable (such as access by index). But another thing is what if our Iterable actually was a Set. We probably want the result of the filter to be a Set again instead of a List. This is the thing that the reference above article calls the "same-result-type principle". Scala is one of the few languages that allows you to design an elegant solution for this problem with minimal code duplication. The main idea is to have a companion object for each implementation (for example Iterable and immutable.List) and also a companion trait for each intermediate trait (for example SeqFactory. One of the crucial things those companions provide is an ability to create a new instance of the "same" collection (probably with a different generic type).



          And the last thing that is not covered in the referenced article is the CanBuildFrom implicit parameter. The logic behind it is following: assume you have a linked-list (List) and want to have an array-based list (Vector) zipped with index. Do you have to create an intermediate linked-list zipped with index in the process? CanBuildFrom is the trick that allows the answer to be: No, you don't have to. Implicit parameters is a fairly advanced feature of Scala and you probably should read more on it if you are not already familiar. The idea is that the compiler can automatically search for a parameter of matching type. So if there is any value matching the type is the scope, the code will compile. It means that the presence of a CanBuildFrom can be used as an evidence that you can change collection's underlying data structure at the same time as you do some data manipulation (zipWithIndex, map, filter, etc.). Each companion object provides default CanBuildFrom value with the same target data structure (for example see immutable.List.canBuildFrom).



          So now we can see how IterableLike.zipWithIndex, which is the default implementation for all subtypes, is implemented



            def zipWithIndex[A1 >: A, That](implicit bf: CanBuildFrom[Repr, (A1, Int), That]): That = {
          val b = bf(repr)
          var i = 0
          for (x <- this) {
          b += ((x, i))
          i += 1
          }
          b.result()
          }


          The signature says that using zipWithIndex you can convert your collection Repr of A into That - a collection of tuples (A1, Int) as long as





          1. A1 is a supertype of A (which is a safe upcast)

          2. You have an evidence that you can build That from Repr (the bf object)


          And what the method does is it asks the bf evidence to create b - a new Builder[(A1, Int), That], then fill the builder with tuples, and finally return the That collection from the builder (b.result()). Both builder and for with var i are used for performance reasons.



          How to implement your own minWithIndex



          The answer depends on how generic you want it to be and how much performance is important for you. Here are some options:



          implicit class MinWithIndexOps[A, Repr <: IterableLike[A, Repr]](it: IterableLike[A, Repr]) {
          def minWithIndexWithZip[B >: A](implicit cmp: Ordering[B], bf: CanBuildFrom[Repr, (A, Int), Iterable[(A, Int)]]): (A, Int) = it.zipWithIndex(bf).min(Ordering.by[(A, Int), B]((kv: (A, Int)) => kv._1))

          def minWithIndexWithFold[B >: A](implicit cmp: Ordering[B]): (A, Int) = {
          if (it.isEmpty)
          throw new UnsupportedOperationException("empty.min")

          val h = it.head
          val res = it.foldLeft((h, 0, 0))((acc, cur) => acc match {
          case (minVal, minIndex, curIndex) => if (cmp.lteq(minVal, cur)) (minVal, minIndex, curIndex + 1) else (cur, curIndex, curIndex + 1)
          })
          (res._1, res._2)
          }

          def minWithIndexWithVars[B >: A](implicit cmp: Ordering[B]): (A, Int) = {
          if (it.isEmpty)
          throw new UnsupportedOperationException("empty.min")

          var minVal = it.head
          var minIndex = 0
          var i = 0
          for (cur <- it) {
          if (cmp.gt(minVal, cur)) {
          minVal = cur
          minIndex = i
          }
          i += 1
          }
          (minVal, minIndex)
          }
          }


          def test(): Unit = {
          val data = "qwerty" :: "asdfg" :: "zxcvb" :: Nil

          println(data.minWithIndexWithZip)
          println(data.minWithIndexWithFold)
          println(data.minWithIndexWithVars)
          }


          Note that here implicit keyword is used in a different meaning to create an implicit class that effectively extends every instance of IterableLike with new methods.



          First implementation minWithIndexWithZip is very straightforward but probably very inefficient: you literally first do zipWithIndex and then call min (note also that the standard min uses an unusual for Scala convention of throwing exception instead of returning Option[A] and I used the same semantics in other implementations). This method is inefficient because you have to create whole new collection just to be disposed almost immediately.



          minWithIndexWithFold is a different implementation that does not rely on zipWithIndex and uses foldLeft instead. fold is at the same time a very basic and very generic operation. One benefit of this code is that it is also immutable. But because of that its performance is also not good: a lot of intermediate accumulator objects will be created and disposed almost immediately.



          The last minWithIndexWithVars is probably the least "pure" but the most performant version that uses straightforward imperative-like code.






          share|improve this answer













          Your question looks like actually two different questions:




          1. How Scala standard library collections are designed and particularly how zipWithIndex works?


          2. How to implement my custom minWithIndex?



          Unfortunately the first question is probably more complicated than the second one.



          Scala collections design



          As of version 2.12 current scala.collection is designed to be very flexible but it comes with a cost of using many advanced features and/or design patterns. Also one might argue that it is over-engineered for the real-world usages and thus it is hard to understand for a Scala novice.



          Designing collections library is a known hard problem because ideally you want to capture many different aspects. The main things is that you probably want to capture as many aspects as possible but at the same time has as little code duplication as possible. (For example consider linked-list vs array-based list. Both probably should have method like indexOf that can be implemented in terms of size and getByIndex so ideally you'd like to have only one implementation here.) This is a real design challenge.



          Here is a non-exhaustive list of aspects that affected Scala's collections design (you may also find some design notes on the earlier (2.8) version here):




          • Unified interfaces for different data structures implementing similar things (for example linked-list vs. array-based list are very similar while linked-list vs tree-based set are much less similar but still have something common). This is the reason for various traits such as Seq or GenTraversableLike and deep inheritance hierarchy.


          • Type-safety. We want a list of integers to be a different type from a list of strings. And we'd like to know the type of the stored elements. This is why all the collections are generic.


          • Performance. In every language standard collections are one of the most used pieces of code so good performance is very important. This is the reason why in several places there is an "impure" mutable implementation for a pure immutable FP interfaces.


          • Separate read-only and mutable collections (especially important in the FP-languages but in others as well). This is why there are scala.collection, scala.collection.mutable and scala.collection.immutable packages.


          • Support for potentially unlimited sequences such as generated (Fibonacci numbers) or coming from an external source (such as sequence of bytes read from console input). This is the reason for things like TraversableOnce or Iterable.


          • Support for sequences that can't be (easily) processed twice. For example a stream of all the deals on some major stock exchange is so huge that it can't be stored and re-processed. This is the reason for TraversableOnce vs Traversable


          • Internal vs external iteration (this is the reason of separation between Iterable vs Traversable)


          • Eager vs. lazy computations (this is the reason for "View"-suffixed types)


          • Support of the collection-like objects that are not collections. For example, you are implementing HTML-parser or even browser. Many HTML-elements can have children but being a collection of children is not the main responsibility. This is the reason for various traits with 'Gen` prefix.



          Now let's get back to zipWithIndex. This is one of many methods (other similar ones are map, fitler and many others) that pose additional challenges for the designers: those methods produce new collection from the current one. On the one hand such methods can be implemented generically one time for all collections, on the other hand with naive implementation we will lose specific types and even worse we might be forced to change implementation and thus probably semantics. Consider this naive implementation of filter method:



          trait Iterable[A] {
          def filter(p: A => Boolean): ? = {
          var res: List[A] = Nil
          for (x <- this)
          if (p(x)) res = x :: res
          res
          }
          }


          What return type to put instead of the ?? If we put just Iterable[A] it means that we immediately lost the ability to use all the methods of the List that are not available in the Iterable (such as access by index). But another thing is what if our Iterable actually was a Set. We probably want the result of the filter to be a Set again instead of a List. This is the thing that the reference above article calls the "same-result-type principle". Scala is one of the few languages that allows you to design an elegant solution for this problem with minimal code duplication. The main idea is to have a companion object for each implementation (for example Iterable and immutable.List) and also a companion trait for each intermediate trait (for example SeqFactory. One of the crucial things those companions provide is an ability to create a new instance of the "same" collection (probably with a different generic type).



          And the last thing that is not covered in the referenced article is the CanBuildFrom implicit parameter. The logic behind it is following: assume you have a linked-list (List) and want to have an array-based list (Vector) zipped with index. Do you have to create an intermediate linked-list zipped with index in the process? CanBuildFrom is the trick that allows the answer to be: No, you don't have to. Implicit parameters is a fairly advanced feature of Scala and you probably should read more on it if you are not already familiar. The idea is that the compiler can automatically search for a parameter of matching type. So if there is any value matching the type is the scope, the code will compile. It means that the presence of a CanBuildFrom can be used as an evidence that you can change collection's underlying data structure at the same time as you do some data manipulation (zipWithIndex, map, filter, etc.). Each companion object provides default CanBuildFrom value with the same target data structure (for example see immutable.List.canBuildFrom).



          So now we can see how IterableLike.zipWithIndex, which is the default implementation for all subtypes, is implemented



            def zipWithIndex[A1 >: A, That](implicit bf: CanBuildFrom[Repr, (A1, Int), That]): That = {
          val b = bf(repr)
          var i = 0
          for (x <- this) {
          b += ((x, i))
          i += 1
          }
          b.result()
          }


          The signature says that using zipWithIndex you can convert your collection Repr of A into That - a collection of tuples (A1, Int) as long as





          1. A1 is a supertype of A (which is a safe upcast)

          2. You have an evidence that you can build That from Repr (the bf object)


          And what the method does is it asks the bf evidence to create b - a new Builder[(A1, Int), That], then fill the builder with tuples, and finally return the That collection from the builder (b.result()). Both builder and for with var i are used for performance reasons.



          How to implement your own minWithIndex



          The answer depends on how generic you want it to be and how much performance is important for you. Here are some options:



          implicit class MinWithIndexOps[A, Repr <: IterableLike[A, Repr]](it: IterableLike[A, Repr]) {
          def minWithIndexWithZip[B >: A](implicit cmp: Ordering[B], bf: CanBuildFrom[Repr, (A, Int), Iterable[(A, Int)]]): (A, Int) = it.zipWithIndex(bf).min(Ordering.by[(A, Int), B]((kv: (A, Int)) => kv._1))

          def minWithIndexWithFold[B >: A](implicit cmp: Ordering[B]): (A, Int) = {
          if (it.isEmpty)
          throw new UnsupportedOperationException("empty.min")

          val h = it.head
          val res = it.foldLeft((h, 0, 0))((acc, cur) => acc match {
          case (minVal, minIndex, curIndex) => if (cmp.lteq(minVal, cur)) (minVal, minIndex, curIndex + 1) else (cur, curIndex, curIndex + 1)
          })
          (res._1, res._2)
          }

          def minWithIndexWithVars[B >: A](implicit cmp: Ordering[B]): (A, Int) = {
          if (it.isEmpty)
          throw new UnsupportedOperationException("empty.min")

          var minVal = it.head
          var minIndex = 0
          var i = 0
          for (cur <- it) {
          if (cmp.gt(minVal, cur)) {
          minVal = cur
          minIndex = i
          }
          i += 1
          }
          (minVal, minIndex)
          }
          }


          def test(): Unit = {
          val data = "qwerty" :: "asdfg" :: "zxcvb" :: Nil

          println(data.minWithIndexWithZip)
          println(data.minWithIndexWithFold)
          println(data.minWithIndexWithVars)
          }


          Note that here implicit keyword is used in a different meaning to create an implicit class that effectively extends every instance of IterableLike with new methods.



          First implementation minWithIndexWithZip is very straightforward but probably very inefficient: you literally first do zipWithIndex and then call min (note also that the standard min uses an unusual for Scala convention of throwing exception instead of returning Option[A] and I used the same semantics in other implementations). This method is inefficient because you have to create whole new collection just to be disposed almost immediately.



          minWithIndexWithFold is a different implementation that does not rely on zipWithIndex and uses foldLeft instead. fold is at the same time a very basic and very generic operation. One benefit of this code is that it is also immutable. But because of that its performance is also not good: a lot of intermediate accumulator objects will be created and disposed almost immediately.



          The last minWithIndexWithVars is probably the least "pure" but the most performant version that uses straightforward imperative-like code.







          share|improve this answer












          share|improve this answer



          share|improve this answer










          answered Nov 23 '18 at 3:17









          SergGrSergGr

          21.2k22243




          21.2k22243

























              0














              I have seen scala using mutable solutions sometimes internally. Same with their implementation of zipWithIndex.



              It is basically :




              • iterating over the collection (for loop)

              • for each element creating a tuple (a, Int) and

              • appending them to mutable mutable buffer (b += ((x, i)))


              So, for input List("first", "second"), you get List(("first", 1), ("second", 2)) after zipWithIndex.



              scala> List("first", "second").zipWithIndex
              res3: List[(String, Int)] = List((first,0), (second,1))


              You can solve the same using recursive approach with immutability,



import scala.annotation.tailrec

object MyZippedCollection {

  def main(args: Array[String]): Unit = {

    def zipWithIndex[a](list: List[a]): List[(a, Int)] = {

      // Tail-recursive helper: pairs each element with its index and accumulates the result.
      // Note that :+ on a List is O(n), so this is quadratic overall; prepending and
      // reversing at the end would make it linear.
      @tailrec
      def zipWith(index: Int, acc: List[(a, Int)], list: List[a]): List[(a, Int)] = {
        if (list.isEmpty) acc
        else zipWith(index + 1, acc :+ ((list.head, index)), list.tail)
      }

      zipWith(0, List.empty[(a, Int)], list)
    }

    val myList = List("scala", "fp", "haskell")

    zipWithIndex(myList).foreach { case (value, index) =>
      println(s"[$index]: $value")
    }
  }
}
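
Running this prints each element with its index:

[0]: scala
[1]: fp
[2]: haskell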





edited Nov 23 '18 at 2:43

answered Nov 22 '18 at 22:40 by prayagupd