
Questions tagged [scala]

Scala is a general-purpose programming language principally targeting the Java Virtual Machine. Designed to express common programming patterns in a concise, elegant, and type-safe way, it fuses both imperative and functional programming styles. Its key features are: an advanced static type system with type inference; function types; pattern-matching; implicit parameters and conversions; operator overloading; full interoperability with Java; concurrency

0 votes
0 answers
7 views

Azure Databricks Hadoop Streaming Error When Reading From Apache Iceberg

We are building out a data lakehouse and upgrading our Databricks runtime from 12.2 LTS to 14.3 LTS to support Python 3.10. We are able to write into our Iceberg tables, but reading those tables ...
Daniel Brenner
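Too little of the error survives in the excerpt to diagnose it, but for reference, a typical Iceberg read from Spark looks roughly like this sketch (catalog, database, and path names are placeholders; an active SparkSession is assumed):

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}

object IcebergReadSketch {
  val spark: SparkSession = SparkSession.builder().getOrCreate()

  // Read through a configured Iceberg catalog (three-part identifier):
  val byCatalog: DataFrame = spark.table("my_catalog.my_db.my_table")

  // Or read directly by table location with the iceberg data source:
  val byPath: DataFrame =
    spark.read.format("iceberg").load("s3://my-bucket/path/to/table")
}
```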
0 votes
1 answer
22 views

I keep getting "Unused expression without side effects" when adding more than one assertion

So I used to be able to run multiple assertions, but now I keep getting the warning "Unused expression without side effects". Every assertion on its own, however, passes the test just fine. ...
clive_
  • 1
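The excerpt does not say which tool raises the warning, but the pattern it describes looks roughly like the ScalaTest sketch below: every matcher expression is a value of type Assertion, and when several appear in one test body all but the last are discarded, which strict linters commonly flag as unused pure expressions.

```scala
import org.scalatest.funsuite.AnyFunSuite
import org.scalatest.matchers.should.Matchers

class MultipleAssertionsSpec extends AnyFunSuite with Matchers {
  test("several assertions in one body") {
    val result = List(1, 2, 3)

    // Each line is an expression of type Assertion; only the last value is the
    // result of the block, the earlier ones are discarded.
    result.length shouldBe 3
    result.head shouldBe 1
    result.last shouldBe 3
  }
}
```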
-4 votes
0 answers
21 views

Write filter on DataFrame in Spark Scala on multiple different columns [closed]

I have 3 columns in my data frame that I want to run my filter on. Filter conditions: dataframe.filter(col(ID) =!= X) || col(y) =!= null || col(y) =!= col(z)) Requirement is: to exclude data from ...
MrWayne
  • 331
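The snippet in the question has unbalanced parentheses, and comparing a column to null never matches; a hedged sketch of how such a multi-column filter is usually written in Spark Scala (column names follow the excerpt, the values are placeholders):

```scala
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.col

object MultiColumnFilterSketch {
  // Keep rows matching any of the three conditions; wrap the whole expression
  // in ! (or invert each operator) to exclude those rows instead.
  def filterRows(df: DataFrame): DataFrame =
    df.filter(
      col("ID") =!= "X" ||
        col("y").isNotNull ||
        col("y") =!= col("z")
    )
}
```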
0 votes
1 answer
20 views

How to filter a DataFrame by a column from a different DataFrame?

I want to filter a DataFrame by a column of Strings from a different DataFrame. val booksReadBestAuthors = userReviewsWithBooksDetails.filter(col("authors").isin(userReviewsAuthorsManyRead:_*)) ...
Joanna Kois
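A hedged sketch of the two usual approaches, assuming the author names live in another DataFrame: collect them to the driver and splat them into isin (only sensible for small lists), or keep everything distributed with a left-semi join:

```scala
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.col

object FilterByOtherDataFrame {
  // Option 1: collect the values locally, then use isin with varargs expansion.
  def withIsin(userReviews: DataFrame, bestAuthors: DataFrame): DataFrame = {
    val authorNames: Seq[String] =
      bestAuthors.select("authors").distinct().collect().map(_.getString(0)).toSeq
    userReviews.filter(col("authors").isin(authorNames: _*))
  }

  // Option 2: no collect at all; a left-semi join keeps only matching rows.
  def withSemiJoin(userReviews: DataFrame, bestAuthors: DataFrame): DataFrame =
    userReviews.join(bestAuthors.select("authors").distinct(), Seq("authors"), "left_semi")
}
```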
1 vote
0 answers
20 views

http4s request with multipart/form-data sending empty content

I am attempting to make a request using the http4s client to send a file payload, which is an Array[Byte], as a multipart form request. There doesn't seem to be any mention of this functionality in the docs. I ...
Mukhayyo Tashpulatova
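A rough sketch of one way to build such a request, assuming http4s 0.23.x behind the Client[IO] interface; the part names and URL are made up. A frequent cause of an empty body on the receiving end is not copying the multipart's own headers (which carry the boundary) onto the request:

```scala
import cats.effect.IO
import org.http4s._
import org.http4s.client.Client
import org.http4s.implicits._
import org.http4s.multipart.{Multiparts, Part}

object MultipartUploadSketch {
  def upload(client: Client[IO], bytes: Array[Byte]): IO[Status] =
    Multiparts.forSync[IO].flatMap { multiparts =>
      multiparts
        .multipart(Vector(
          Part.fileData("file", "payload.bin", fs2.Stream.emits(bytes.toIndexedSeq).covary[IO])
        ))
        .flatMap { multipart =>
          val request = Request[IO](Method.POST, uri"https://example.com/upload")
            .withEntity(multipart)
            .withHeaders(multipart.headers) // sets Content-Type with the boundary
          client.status(request)
        }
    }
}
```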
1 vote
0 answers
37 views

How to force a match type to be fully reduced at compile time?

I don't get why a match type is not always reduced. Match types should help eliminate illegal arguments at compile time, but as it turns out they do not always reduce in full. Take this silly ...
user3508638
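Without the question's own code it is hard to say why reduction stalls, but the general rule is that a match type only reduces when the compiler can decide the scrutinee against the cases. A minimal illustration in the style of the Scala 3 reference (the Elem example):

```scala
type Elem[X] = X match
  case String      => Char
  case Array[t]    => t
  case Iterable[t] => t

// With a concrete scrutinee the match type reduces fully:
val c: Elem[String]      = 'a'   // Elem[String] is Char
val n: Elem[List[Float]] = 1.0f  // Elem[List[Float]] is Float

// With an abstract scrutinee (Elem[X] for an unconstrained type parameter X)
// the compiler has nothing to decide the cases against, so the type stays
// unreduced until each concrete call site; this is where "match type could not
// be fully reduced" style errors typically surface.
def elemOf[X](x: X): Elem[X] = ???   // the body cannot pick a case for an unknown X
```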
0 votes
0 answers
28 views

Scala Matchers for comparing Options with Arrays

Using org.scalatest.matchers.should.Matchers, I'm trying to compare values of type Option[Array[String]], but "Matchers.should equals" returns false when I compare val arr1 = ...
Jelly
  • 1,192
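The usual explanation is that Array uses reference equality, and once the arrays are wrapped in an Option the array-aware comparison no longer applies. A hedged sketch of the common workaround: compare the contents as Seq, so equality is element-wise.

```scala
import org.scalatest.funsuite.AnyFunSuite
import org.scalatest.matchers.should.Matchers

class OptionArraySpec extends AnyFunSuite with Matchers {
  test("compare Option[Array[String]] values") {
    val arr1: Option[Array[String]] = Some(Array("a", "b"))
    val arr2: Option[Array[String]] = Some(Array("a", "b"))

    // arr1 shouldEqual arr2 would fail: Option.equals delegates to
    // Array.equals, which is reference equality.

    // Converting the contents to an immutable Seq compares element by element:
    arr1.map(_.toSeq) shouldEqual arr2.map(_.toSeq)
  }
}
```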
0 votes
0 answers
14 views

json4s: case class with fields that are JSON strings to a JSON string representation

I have a case class with two fields like the following: case class Data(a: String, b: String) The values of both a and b are complex JSON strings. When I try to do compact(render(data)) I ...
Sadhna Jain
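A hedged sketch of one way to get nested JSON rather than escaped strings with json4s: parse the string fields into JValues before rendering (the field names a and b come from the excerpt, the rest is assumed):

```scala
import org.json4s._
import org.json4s.jackson.JsonMethods._

object NestedJsonSketch extends App {
  case class Data(a: String, b: String)

  val data = Data("""{"x": 1}""", """["y", 2]""")

  // Rendering the case class directly would escape a and b as plain strings;
  // parsing them first embeds them as real nested JSON values.
  val json: JValue = JObject(
    "a" -> parse(data.a),
    "b" -> parse(data.b)
  )

  println(compact(render(json)))   // {"a":{"x":1},"b":["y",2]}
}
```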
0 votes
3 answers
55 views

Collect with inline function

f(x) = x+2 p(y) = if(y>2) true else false List(4,7,1,3,9).collect{ case i if p(f(i)) => f(i) } This works, but if f(x) were a heavy compute function, it would mean I am forced to call it ...
IUnknown
  • 9,647
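A hedged sketch of the usual ways to avoid calling the expensive function twice: map first and then filter, or use a view if laziness matters (f and p are stand-ins for the question's functions):

```scala
object CollectOnceSketch extends App {
  def f(x: Int): Int = x + 2          // imagine this is expensive
  def p(y: Int): Boolean = y > 2

  val xs = List(4, 7, 1, 3, 9)

  // collect with a guard calls f twice for every element that is kept:
  val viaCollect = xs.collect { case i if p(f(i)) => f(i) }

  // Mapping first calls f exactly once per element, then the predicate filters:
  val viaMapFilter = xs.map(f).filter(p)

  // The same, but without building the intermediate list:
  val viaView = xs.view.map(f).filter(p).toList

  println((viaCollect, viaMapFilter, viaView))
}
```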
1 vote
0 answers
33 views

How can I get the class of a companion object as a constant?

I'm working with a Scala 2 project. I have the following code: package com.myproject.commands; object MyCommand { def execute(args: Map[String, String]) = { //do some validations... ...
Luiggi Mendoza
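The excerpt shows a standalone object rather than a companion, and it is cut off before the part about using the class as a constant, so only the runtime side can be sketched safely: an object's class is its module class, whose JVM name ends in $ (package and names below come from the excerpt):

```scala
package com.myproject.commands

object MyCommand {
  def execute(args: Map[String, String]): Unit = {
    // validations elided in the question
  }
}

object ClassOfObjectDemo extends App {
  // getClass on the object yields the module class at runtime:
  val cls: Class[_] = MyCommand.getClass
  println(cls.getName)   // com.myproject.commands.MyCommand$
}
```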
0 votes
1 answer
37 views

Scala 3 polymorphic extension method not working well with literal types

In my project, I'm using Scala 3 and I think I've found a weird case where a polymorphic extension method is not working well with custom literal types. I've reduced down the code so that I can show ...
Jake
  • 1,538
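The reduced code isn't shown in the excerpt, so the sketch below is only a guess at the shape of the problem: a polymorphic extension method whose type parameter gets widened away from the literal type unless a Singleton bound is added.

```scala
// Hypothetical reduction, not the question's actual code.

// Without a Singleton bound the literal argument widens to Int:
extension [T <: Int](x: T)
  def widenedPair: (T, T) = (x, x)

// With a Singleton bound the literal type is preserved:
extension [T <: Int & Singleton](x: T)
  def literalPair: (T, T) = (x, x)

val a = 1.widenedPair   // inferred as (Int, Int)
val b = 1.literalPair   // inferred as (1, 1)
```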
0 votes
1 answer
31 views

What is the maximum number of entries an array in a Spark column can hold?

I've created a struct with the data of some columns combined. Large numbers of these structs now occur for my unique identifier values. I want to combine these structs into an array using collect_list....
M.S.Visser
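In practice the limit on such an array column is executor memory rather than a fixed element count, since the values for one key must fit in memory during the aggregation. A hedged sketch of the aggregation described, with made-up column names:

```scala
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.{col, collect_list, struct}

object CollectStructsSketch {
  // Combine several columns into one struct per row, then gather all structs
  // for each id into a single array column.
  def collectRecords(df: DataFrame): DataFrame =
    df.withColumn("record", struct(col("colA"), col("colB"), col("colC")))
      .groupBy("id")
      .agg(collect_list(col("record")).as("records"))
}
```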
0 votes
0 answers
32 views

Input that .repeat() takes in Gatling simulation

I have a scenario where, for the size of an array, I need to repeat a piece of code in Gatling. While .repeat() works with an integer provided for items.size.toInt, it does not honour the provided value and repeat ...
Ankit2201
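A sketch of the usual fix, assuming the Gatling 3.x Scala DSL: a plain Int passed to repeat() is evaluated once when the scenario is built, whereas passing a function makes the count come from the session at run time. The "items" list and endpoint below are placeholders, not from the question.

```scala
import io.gatling.core.Predef._
import io.gatling.http.Predef._

class RepeatBySessionSizeSimulation extends Simulation {

  private val httpProtocol = http.baseUrl("https://example.com") // placeholder

  private val scn = scenario("repeat once per item")
    .exec(session => session.set("items", List("a", "b", "c")))
    // The function is evaluated per virtual user, so the loop count follows
    // whatever size the session's list has at that point.
    .repeat(session => session("items").as[Seq[String]].size) {
      exec(http("call").get("/endpoint"))
    }

  setUp(scn.inject(atOnceUsers(1))).protocols(httpProtocol)
}
```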
0 votes
1 answer
49 views

Spark Scala vs PySpark: why is the DAG different?

I am converting a PySpark job to Scala, and the jobs execute on EMR. The parameters, data, and code are the same. However, I see the run time is different, and the DAG that gets created is also different. Here I ...
user3858193
  • 1,438
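Differences in the physical plan usually explain differences in the DAG, so comparing the explain output of the same query on both sides is a reasonable first step; a minimal sketch, assuming Spark 3.x:

```scala
import org.apache.spark.sql.DataFrame

object ComparePlans {
  // Prints the logical and physical plans; running it for both the Scala and
  // the PySpark version of the same query makes plan (and hence DAG)
  // differences visible.
  def show(df: DataFrame): Unit = df.explain("formatted")
}
```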
0 votes
1 answer
54 views

Output is not given as Range(1, 2, 3, 4) for val from1Until5 = 1 until 5 println(s"Range from 1 until 5 where 5 is excluded = $from1Until5")

I am executing println("\nStep 2: Create a numeric range from 1 to 5 but excluding the last integer number 5") val from1Until5 = 1 until 5 println(s"Range from 1 until 5 where 5 is ...
bigdata spark
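This is usually just a toString difference: since Scala 2.13 a Range prints its bounds rather than its elements (Scala 2.12 printed Range(1, 2, 3, 4)). A small sketch:

```scala
object RangeDemo extends App {
  val from1Until5 = 1 until 5

  // On Scala 2.13+ a Range's toString shows the bounds, not the elements:
  println(s"Range from 1 until 5 where 5 is excluded = $from1Until5")
  // Range from 1 until 5 where 5 is excluded = Range 1 until 5

  // Materialise it (or use mkString) to see the individual elements:
  println(from1Until5.toList)          // List(1, 2, 3, 4)
  println(from1Until5.mkString(", "))  // 1, 2, 3, 4
}
```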
