

java - Why does "split" on an empty string return a non-empty array?

If you split an orange zero times, you have exactly one piece - the orange.... Read More
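
A quick REPL sketch of the behavior the answer describes (standard JVM String.split semantics: splitting the empty string yields one piece, and trailing empty strings are dropped unless a negative limit is passed):

    "".split(",")            // Array("") - one element: the original (empty) string
    "a,b,".split(",")        // Array(a, b) - trailing empty strings are dropped
    "a,b,".split(",", -1)    // Array(a, b, "") - a negative limit keeps them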


Scala - printing arrays

In Scala 2.8, you can use the deep method defined on Array, which returns an IndexedSeq containing all of the (possibly nested) elements of this array, and call mkString on that: scala> val array = A... Read More
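
A short sketch of the approach; note that deep is available on Array from Scala 2.8 through 2.12 but was dropped in 2.13, where mkString alone (or java.util.Arrays.deepToString) is the usual route:

    val array = Array(Array(1, 2), Array(3, 4))
    println(array.deep.mkString(", "))                      // Array(1, 2), Array(3, 4)
    // For a flat array, mkString is enough on any Scala version:
    println(Array(1, 2, 3).mkString("Array(", ", ", ")"))   // Array(1, 2, 3)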


scala - How do I skip a header from CSV files in Spark?

val data = sc.textFile("path_to_data") val header = data.first() // extract header val rows = data.filter(row => row != header) // filter out header... Read More
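
A slightly fuller Scala sketch of the same idea; the path and app name below are placeholders:

    import org.apache.spark.{SparkConf, SparkContext}

    val sc = new SparkContext(new SparkConf().setAppName("skip-header").setMaster("local[*]"))
    val data = sc.textFile("path_to_data")          // placeholder path
    val header = data.first()                       // extract the header line
    val rows = data.filter(row => row != header)    // drop every line equal to the header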


java - Restart elasticsearch node

The correct way to restart a node is to shut it down, using either the shutdown API or a TERM signal sent to the process (e.g. with kill $PID). Once shut down, you can start a new node using whateve... Read More
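
If you want to script the TERM-signal route from Scala, a minimal sketch; the pgrep pattern is an assumption about how the node was started, and the shutdown API mentioned above only exists in older Elasticsearch releases:

    import scala.sys.process._

    // Assumes a single local node started via the standard bootstrap class.
    val pid = "pgrep -f org.elasticsearch.bootstrap.Elasticsearch".!!.trim
    s"kill $pid".!   // TERM triggers a clean shutdown
    // ...then start a new node however you normally do (service script, bin/elasticsearch, etc.)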


arrays - Appending an element to the end of a list in Scala

List(1,2,3) :+ 4 Results in List[Int] = List(1, 2, 3, 4) Note that this operation has a complexity of O(n). If you need this operation frequently, or for long lists, consider using another data typ... Read More
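
A small sketch of the append, plus one alternative when appends are frequent (ListBuffer is just one option; Vector also gives effectively constant-time appends):

    val xs = List(1, 2, 3)
    val appended = xs :+ 4       // List(1, 2, 3, 4), but copies the whole list: O(n)

    import scala.collection.mutable.ListBuffer
    val buf = ListBuffer(1, 2, 3)
    buf += 4                     // cheap append
    val result = buf.toList     // List(1, 2, 3, 4)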


Why would I use Scala/Lift over Java/Spring?

I've gotta say that I strongly disagree with Dan LaRocque's answer. Lift is not monolithic. It is composed of discrete elements. It does not ignore J/EE elements; it supports the likes of JNDI, JTA, J... Read More


scala - Go to next compiler error across project in IntelliJ

To navigate between errors or warnings in IntelliJ you can do one of the following: Use keyboard shortcuts F2 (Next) and Shift+F2 (Previous) respectively. On the main menu, choose Navigate | Next /... Read More


How to convert a scala.List to a java.util.List?

Not sure why this hasn't been mentioned before, but I think the most intuitive way is to invoke the asJava decorator method of JavaConverters directly on the Scala list: scala> val scalaList = List(1,... Read More
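
A minimal sketch; on Scala 2.13+ the converters live in scala.jdk.CollectionConverters, while earlier releases use scala.collection.JavaConverters, both exposing the same asJava decorator:

    import scala.jdk.CollectionConverters._   // scala.collection.JavaConverters._ on 2.12 and earlier

    val scalaList = List(1, 2, 3)
    val javaList: java.util.List[Int] = scalaList.asJava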


scala - Enforce type difference

I have a simpler solution, which also leverages ambiguity: trait =!=[A, B] implicit def neq[A, B]: A =!= B = null // This pair excludes the A =:= B case implicit def neqAmbig1[A]: A =!= A = null... Read More
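
A fuller sketch of the ambiguity trick the snippet starts to show: asking for A =!= B compiles only when A and B differ, because the two identical implicits make the A = B case ambiguous (names follow the snippet; distinct is a hypothetical example method):

    trait =!=[A, B]

    implicit def neq[A, B]: A =!= B = null
    // This pair excludes the A =:= B case by making it ambiguous:
    implicit def neqAmbig1[A]: A =!= A = null
    implicit def neqAmbig2[A]: A =!= A = null

    def distinct[A, B](a: A, b: B)(implicit ev: A =!= B): (A, B) = (a, b)

    distinct(1, "one")     // compiles: Int and String differ
    // distinct(1, 2)      // does not compile: ambiguous implicits for Int =!= Int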


scala - How to define partitioning of DataFrame?

Spark >= 2.3.0 SPARK-22614 exposes range partitioning. val partitionedByRange = df.repartitionByRange(42, $"k") partitionedByRange.explain // == Parsed Logical Plan == // 'RepartitionByExpression ['... Read More
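
A self-contained sketch of the range-partitioning call (Spark >= 2.3); the tiny DataFrame and the column name k below are placeholders standing in for real data:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("range-partitioning").master("local[*]").getOrCreate()
    import spark.implicits._

    val df = Seq((3, "a"), (1, "b"), (5, "c")).toDF("k", "v")
    val partitionedByRange = df.repartitionByRange(42, $"k")
    partitionedByRange.explain()   // shows a RepartitionByExpression node with range partitioning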