score:2
Your moveGraph function should look like this:
def moveGraph(f: Double => Double, x: Double, y: Double): Double => Double =
  p => f(p - x) + y
If you want to use the passed-in parameter, you have to name it. The compiler doesn't understand what _ should stand for. Here is a list of all possible usages for _.
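For instance, a small sketch of how the returned function behaves (square and the sample values below are made up for illustration):

// Shifts the graph of square right by 2 and up by 3.
val square: Double => Double = p => p * p
val moved = moveGraph(square, 2.0, 3.0)   // equivalent to p => (p - 2) * (p - 2) + 3

println(moved(2.0))   // 3.0  (the shifted vertex)
println(moved(4.0))   // 7.0  (square(2.0) + 3.0)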
score:1
A fancier way to do it in Scala is using function combinators:
def moveGraph(f: Double => Double, x: Double, y: Double): Double => Double =
  { p: Double => p - x } andThen f andThen y.+
((-x).+ andThen f andThen y.+ would be even nicer, but unfortunately it doesn't work in this case, because + is overloaded and Scala gets confused by it.)
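If the leading (-x).+ is the part that won't compile, one hedged alternative (the name moveGraphPointFree is made up here) is to keep the andThen pipeline but spell both shifts out as small typed lambdas, which sidesteps the overloaded +:

def moveGraphPointFree(f: Double => Double, x: Double, y: Double): Double => Double =
  ((p: Double) => p - x) andThen f andThen ((r: Double) => r + y)

// e.g. moveGraphPointFree(math.sqrt, 1.0, 2.0)(5.0) == math.sqrt(4.0) + 2.0 == 4.0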
score:2
Just adding to Mifeet's answer: it's not that you have to name parameters. The shorthand notation for anonymous functions is quite simple and expands in its current scope, so
f(_ - x) + y
becomes
f(param => param - x) + y
This is what the compiler did, and it complained that it doesn't know the type of param. If f expected a function Int => Int as its argument, that would be a valid usage. But you want to do something else here, so you need to be explicit about your param:
param => f(param - x) + y
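To make the scoping rule concrete, here is a hedged sketch (square and the sample values are made up) contrasting a spot where the placeholder expands the way you want with the one from the question:

val square: Double => Double = p => p * p
val (x, y) = (2.0, 3.0)

// Fine: _ - x expands to param => param - x right inside map's argument list.
List(1.0, 2.0, 3.0).map(_ - x)                   // List(-1.0, 0.0, 1.0)

// Not fine: square(_ - x) + y would expand to square(param => param - x) + y,
// passing a function where square expects a Double, so the compiler cannot
// infer param's type. Naming the parameter keeps everything in one lambda:
val moved: Double => Double = param => square(param - x) + y
println(moved(4.0))                              // 7.0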
Source: stackoverflow.com