score:5
Basic approach

If all the methods in the builder's interface (except perhaps `build` itself) just mutate the builder instance and return `this`, then they can be abstracted as `Builder => Unit` functions. This is true for `NettyChannelBuilder`, if I'm not mistaken. What you want to do in this case is combine a bunch of those `Builder => Unit` functions into a single `Builder => Unit` that runs the original ones consecutively.

Here is a direct implementation of this idea for `NettyChannelBuilder`:
```scala
object Builder {
  type Input  = NettyChannelBuilder
  type Output = ManagedChannel

  case class Op(run: Input => Unit) {
    def and(next: Op): Op = Op { in =>
      this.run(in)
      next.run(in)
    }

    def runOn(in: Input): Output = {
      run(in)
      in.build()
    }
  }

  // Combine several ops into one
  def combine(ops: Op*): Op = Op(in => ops.foreach(_.run(in)))

  // Wrap methods from the builder interface
  val addTransportSecurity: Op = Op(_.useTransportSecurity())
  def addSslContext(sslContext: SslContext): Op = Op(_.sslContext(sslContext))
}
```
And you can use it like this:

```scala
val builderPipeline: Builder.Op =
  Builder.addTransportSecurity and
  Builder.addSslContext(???)

builderPipeline runOn NettyChannelBuilder.forAddress("localhost", 80)
```
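The same combinator works for any mutable builder. Here is a minimal self-contained sketch of the same `Op` idea run against `StringBuilder` instead of the gRPC types, so it can be tried without grpc-java on the classpath (the `SbOps` names are illustrative, not from any library):

```scala
object SbOps {
  type Input = StringBuilder

  // Same shape as Op above: a side-effecting step on the mutable builder
  case class Op(run: Input => Unit) {
    def and(next: Op): Op = Op { in =>
      this.run(in)
      next.run(in)
    }
    def runOn(in: Input): String = {
      run(in)
      in.result() // the "build" step for StringBuilder
    }
  }

  val addHello: Op = Op(_.append("hello"))
  def addWord(word: String): Op = Op { sb => sb.append(" "); sb.append(word) }
}

val pipeline = SbOps.addHello and SbOps.addWord("world")
println(pipeline runOn new StringBuilder) // prints: hello world
```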
Reader monad

It's also possible to use the Reader monad here. The Reader monad allows combining two functions `Context => A` and `A => Context => B` into a `Context => B`. Of course, every function you want to combine here is just `Context => Unit`, where the `Context` is `NettyChannelBuilder`. But the `build` method is `NettyChannelBuilder => ManagedChannel`, and with this approach we can add it to the pipeline too.

Here is an implementation without any third-party libraries:
```scala
object MonadicBuilder {
  type Context = NettyChannelBuilder

  case class Op[Result](run: Context => Result) {
    def map[Final](f: Result => Final): Op[Final] =
      Op { ctx =>
        f(run(ctx))
      }

    def flatMap[Final](f: Result => Op[Final]): Op[Final] =
      Op { ctx =>
        f(run(ctx)).run(ctx)
      }
  }

  val addTransportSecurity: Op[Unit] = Op(_.useTransportSecurity())
  def addSslContext(sslContext: SslContext): Op[Unit] = Op(_.sslContext(sslContext))
  val build: Op[ManagedChannel] = Op(_.build())
}
```
It's convenient to use with for-comprehension syntax:

```scala
val pipeline = for {
  _ <- MonadicBuilder.addTransportSecurity
  sslContext = ???
  _ <- MonadicBuilder.addSslContext(sslContext)
  result <- MonadicBuilder.build
} yield result

val channel = pipeline run NettyChannelBuilder.forAddress("localhost", 80)
```
This approach can be useful in more complex scenarios, when some of the methods return values which should be used in later steps. But for `NettyChannelBuilder`, where most functions are just `Context => Unit`, it only adds unnecessary boilerplate in my opinion.

As for other monads: the main purpose of State is to track changes to a reference to an object, which is useful because that object is normally immutable. For a mutable object, Reader works just fine. The Free monad is used in similar scenarios as well, but it adds much more boilerplate; its usual use case is when you want to build an abstract syntax tree of actions/commands and then execute it with different interpreters.
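To make the Reader-vs-State distinction concrete, here is a minimal hand-rolled State sketch (the `Config` type and `set` helper are made up for illustration). Each step receives the new immutable value produced by the previous step, which is exactly the threading a mutable builder gets for free with Reader:

```scala
// Minimal State: a function from the current state to (next state, result)
case class State[S, A](run: S => (S, A)) {
  def map[B](f: A => B): State[S, B] =
    State { s => val (s2, a) = run(s); (s2, f(a)) }
  def flatMap[B](f: A => State[S, B]): State[S, B] =
    State { s => val (s2, a) = run(s); f(a).run(s2) }
}

case class Config(host: String = "", port: Int = 0)

// Lift an immutable update into the State monad
def set(f: Config => Config): State[Config, Unit] = State(s => (f(s), ()))

val program = for {
  _ <- set(_.copy(host = "localhost"))
  _ <- set(_.copy(port = 80))
} yield ()

println(program.run(Config())._1) // prints: Config(localhost,80)
```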
Generic builder

It's quite simple to adapt the previous two approaches to support any builder, or any mutable class in general. Though without creating separate wrappers for the mutating methods, the boilerplate at the use site grows quite a bit. For example, with the monadic builder approach:
```scala
class GenericBuilder[Context] {
  case class Op[Result](run: Context => Result) {
    def map[Final](f: Result => Final): Op[Final] =
      Op { ctx =>
        f(run(ctx))
      }

    def flatMap[Final](f: Result => Op[Final]): Op[Final] =
      Op { ctx =>
        f(run(ctx)).run(ctx)
      }
  }

  def apply[Result](run: Context => Result) = Op(run)

  def result: Op[Context] = Op(identity)
}
```
Using it:

```scala
class Person {
  var name: String = _
  var age: Int = _
  var jobExperience: Int = _

  def getYearsAsAnAdult: Int = (age - 18) max 0

  override def toString = s"Person($name, $age, $jobExperience)"
}

val build = new GenericBuilder[Person]

val builder = for {
  _ <- build(_.name = "John")
  _ <- build(_.age = 36)
  adultFor <- build(_.getYearsAsAnAdult)
  _ <- build(_.jobExperience = adultFor)
  result <- build.result
} yield result

// Prints: Person(John, 36, 18)
println(builder.run(new Person))
```
score:0
A very simple functional approach is to have a case class that collects the configuration, with methods that update a value and return a new copy, so the channel can be built at the end:

```scala
case class MyNettyChannel(
  ip: String,
  port: Int,
  transportSecurity: Boolean,
  sslContext: Option[SslContext]
) {
  def forAddress(addrIp: String, addrPort: Int) = copy(ip = addrIp, port = addrPort)
  def withTransportSecurity    = copy(transportSecurity = true)
  def withoutTransportSecurity = copy(transportSecurity = false)
  def withSslContext(ctx: SslContext) = copy(sslContext = Some(ctx))

  def build: NettyChannel = {
    /* create the actual instance using the existing builder */
    ???
  }
}

object MyNettyChannel {
  val default = MyNettyChannel("127.0.0.1", 80, false, None)
}
```
```scala
val nettyChannel = MyNettyChannel.default
  .forAddress(hostIp, hostPort)
  .withTransportSecurity
  .withSslContext(ctx)
  .build
```
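A self-contained variant of the same pattern with plain types, so it can be tried without gRPC on the classpath (`ServerConfig` and its fields are illustrative names, not from any library):

```scala
case class ServerConfig(host: String, port: Int, tls: Boolean) {
  def forAddress(h: String, p: Int) = copy(host = h, port = p)
  def withTls    = copy(tls = true)
  def withoutTls = copy(tls = false)
}

object ServerConfig {
  val default = ServerConfig("127.0.0.1", 80, tls = false)
}

// Each call returns a new immutable copy; the original `default` is untouched
val cfg = ServerConfig.default.forAddress("example.org", 443).withTls
println(cfg) // prints: ServerConfig(example.org,443,true)
```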
A similar approach (without having to write the copy methods yourself) is to use lenses, for example with the quicklens library:

```scala
import com.softwaremill.quicklens._

val nettyChannel = MyNettyChannel.default
  .modify(_.ip).setTo(hostIp)
  .modify(_.port).setTo(1234)
  .modify(_.transportSecurity).setTo(true)
  .modify(_.sslContext).setTo(Some(ctx))
  .build
```
score:2
I know that we said no cats et al., but I decided to post this anyway: first, in all honesty, as an exercise for myself, and second because in essence these libraries simply aggregate "common" typed functional constructs and patterns. After all, would you ever consider writing an HTTP server in vanilla Java/Scala, or would you grab a battle-tested one off the shelf? (Sorry for the evangelism.) Regardless, you could replace their heavyweight implementation with a homegrown one of your own, if you really wanted to.

I will present two schemes below, the first using the Reader monad, the second using the State monad. I personally find the first approach a bit clunkier than the second, but neither is too pretty on the eye. I guess a more experienced practitioner could do a better job of it than I.

Before that, I find the following rather interesting: semicolons vs monads.

The code:

I defined the Java bean:
```java
public class Bean {
    private int x;
    private String y;

    public Bean(int x, String y) {
        this.x = x;
        this.y = y;
    }

    @Override
    public String toString() {
        return "Bean{" +
                "x=" + x +
                ", y='" + y + '\'' +
                '}';
    }
}
```
and the builder:

```java
public final class BeanBuilder {
    private int x;
    private String y;

    private BeanBuilder() {
    }

    public static BeanBuilder aBean() {
        return new BeanBuilder();
    }

    public BeanBuilder withX(int x) {
        this.x = x;
        return this;
    }

    public BeanBuilder withY(String y) {
        this.y = y;
        return this;
    }

    public Bean build() {
        return new Bean(x, y);
    }
}
```
Now for the Scala code:

```scala
import cats.Id
import cats.data.{Reader, State}

object Boot extends App {
  val r: Reader[Unit, Bean] = for {
    i <- Reader { _: Unit => BeanBuilder.aBean() }
    n <- Reader { _: Unit => i.withX(12) }
    b <- Reader { _: Unit => n.build() }
    _ <- Reader { _: Unit => println(b) }
  } yield b

  private val run: Unit => Id[Bean] = r.run

  println("will come before the value of the bean")
  run(())

  val state: State[BeanBuilder, Bean] = for {
    _ <- State[BeanBuilder, BeanBuilder] { b: BeanBuilder => (b, b.withX(13)) }
    _ <- State[BeanBuilder, BeanBuilder] { b: BeanBuilder => (b, b.withY("look at me")) }
    bean <- State[BeanBuilder, Bean] { b: BeanBuilder => (b, b.build()) }
    _ <- State.pure(println(bean))
  } yield bean

  println("will also come before the value of the bean")
  state.runA(BeanBuilder.aBean()).value
}
```
The output, due to the lazy evaluation of these monads, is:

will come before the value of the bean
Bean{x=12, y='null'}
will also come before the value of the bean
Bean{x=13, y='look at me'}
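If you want the same deferred-evaluation behaviour without pulling in cats, a hand-rolled Reader is only a few lines. Here it runs against a `StringBuilder` stand-in for `BeanBuilder` so the snippet is self-contained; nothing executes until `run` is called:

```scala
case class Reader[C, A](run: C => A) {
  def map[B](f: A => B): Reader[C, B] = Reader(c => f(run(c)))
  def flatMap[B](f: A => Reader[C, B]): Reader[C, B] = Reader(c => f(run(c)).run(c))
}

val r: Reader[StringBuilder, String] = for {
  b <- Reader { sb: StringBuilder => sb.append("x=12") } // mutate the "builder"
  s <- Reader { _: StringBuilder => b.toString }         // the "build" step
} yield s

println("will come before the value") // the pipeline has not run yet
println(r.run(new StringBuilder))     // prints: x=12
```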
Source: stackoverflow.com