You can get something similar, but simpler, like so:
```scala
import sys.process._
import util.Try

class StdInReader(val reader: String) {
  def send(input: String): Try[String] =
    Try(s"/bin/echo $input".#|(reader).!!.trim)
}
```
usage:

```scala
val bc = new StdInReader("/usr/bin/bc")
bc.send("2 * 8")  //res0: scala.util.Try[String] = Success(16)
bc.send("12 + 8") //res1: scala.util.Try[String] = Success(20)
bc.send("22 - 8") //res2: scala.util.Try[String] = Success(14)
```
Programs that exit with a non-zero exit code (bc doesn't) will result in a `Failure()`.
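As a quick sanity check on that `Failure` behavior, wrapping `.!!` in a `Try` turns a non-zero exit into a `Failure` (a minimal sketch, assuming the standard POSIX `true`/`false` commands are on the path):

```scala
import sys.process._
import util.Try

// "false" exits with status 1, so .!! throws and Try captures a Failure.
val bad = Try("false".!!)

// "true" exits 0 and prints nothing, so we get Success("").
val ok = Try("true".!!)
```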
If you need more fine-grained control, you might start with something like this and expand on it.
```scala
import sys.process._

class ProcHandler(val cmnd: String) {
  private val resbuf = collection.mutable.Buffer.empty[String]

  def run(data: Seq[String]): Unit = {
    cmnd.run(new ProcessIO(
      in => {
        val writer = new java.io.PrintWriter(in)
        data.foreach(writer.println)
        writer.close()
      },
      out => {
        val src = io.Source.fromInputStream(out)
        src.getLines().foreach(resbuf += _)
        src.close()
      },
      _.close()  //maybe create separate buffer for stderr?
    )).exitValue()
  }

  def results(): Seq[String] = {
    val rs = collection.mutable.Buffer.empty[String]
    resbuf.copyToBuffer(rs)
    resbuf.clear()
    rs
  }
}
```
usage:

```scala
val bc = new ProcHandler("/usr/bin/bc")
bc.run(List("4+5", "6-2", "2*5"))
bc.run(List("99/3", "11*77"))
bc.results() //res0: Seq[String] = ArrayBuffer(9, 4, 10, 33, 847)
```
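Regarding the `//maybe create separate buffer for stderr?` comment above: if line-based capture is enough, one alternative worth considering is `ProcessLogger`, which takes separate callbacks for stdout and stderr lines (a sketch; the `sh -c` command here is just a throwaway illustration, not from the original answer):

```scala
import sys.process._

// Sketch: capture stdout and stderr lines into separate buffers.
val outBuf = collection.mutable.Buffer.empty[String]
val errBuf = collection.mutable.Buffer.empty[String]

// A throwaway command that writes one line to each stream.
val exitCode = Seq("sh", "-c", "echo out-line; echo err-line 1>&2")
  .!(ProcessLogger(outBuf += _, errBuf += _))
// outBuf and errBuf now hold the stdout and stderr lines separately.
```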
OK, I did some more research and found this. It appears to get at what you want, but there are limitations. In particular, the process stays open for input until you want to get output. At that point the IO streams are closed to ensure all buffers are flushed.
```scala
import sys.process._
import util.Try

class ProcHandler(val cmnd: String) {
  private val procInput  = new java.io.PipedOutputStream()
  private val procOutput = new java.io.PipedInputStream()

  private val proc = cmnd.run(new ProcessIO(
    { in =>  // attach to the process's internal input stream
      val istream = new java.io.PipedInputStream(procInput)
      val buf = Array.fill(100)(0.toByte)
      Iterator.iterate(istream.read(buf)) { br =>
        in.write(buf, 0, br)
        istream.read(buf)
      }.takeWhile(_ >= 0).toList
      in.close()
    },
    { out =>  // attach to the process's internal output stream
      val ostream = new java.io.PipedOutputStream(procOutput)
      val buf = Array.fill(100)(0.toByte)
      Iterator.iterate(out.read(buf)) { br =>
        ostream.write(buf, 0, br)
        out.read(buf)
      }.takeWhile(_ >= 0).toList
      out.close()
    },
    _ => ()  // ignore stderr
  ))

  private val procO = new java.io.BufferedReader(new java.io.InputStreamReader(procOutput))
  private val procI = new java.io.PrintWriter(procInput, true)

  def feed(str: String): Unit = procI.println(str)
  def feed(ss: Seq[String]): Unit = ss.foreach(procI.println)

  def read(): List[String] = {
    procI.close()  //close input before reading output
    val lines = Stream.iterate(Try(procO.readLine)) { _ =>
      Try(procO.readLine)
    }.takeWhile(_.isSuccess).map(_.get).toList
    procO.close()
    lines
  }
}
```
usage:

```scala
val bc = new ProcHandler("/usr/bin/bc")
bc.feed(List("9*3", "4+11")) //res0: Unit = ()
bc.feed("4*13")              //res1: Unit = ()
bc.read()                    //res2: List[String] = List(27, 15, 52)
bc.read()                    //res3: List[String] = List()
```
OK, this is my final word on the subject. I think this ticks every item on your wish list: it starts the process only once, the process stays alive until actively closed, and it allows alternating writes and reads.
```scala
import sys.process._

class ProcHandler(val cmnd: Seq[String]) {
  private var os: java.io.OutputStream = null
  private var is: java.io.InputStream  = null
  private val pio  = new ProcessIO(os = _, is = _, _.close())
  private val proc = cmnd.run(pio)

  def feed(ss: String*): Unit = {
    ss.foreach(_.foreach(os.write(_)))
    os.flush()
  }

  def ready: Boolean = is.available() > 0

  def read(): String =
    Seq.fill[Char](is.available())(is.read().toChar).mkString

  def close(): Unit = {
    os.close()        // closing stdin lets the process terminate
    proc.exitValue()  // wait for it to finish before closing our side
    is.close()
  }
}
```
There are still issues and much room for improvement. IO is handled at a basic level (streams), and I'm not sure what I'm doing here is completely safe and correct. The input, `feed()`, is required to supply the necessary newline terminations, and the output, `read()`, is just a raw `String` and not separated into a nice collection of string results.

Note that this will bleed system resources if the client code fails to `close()` all processes.

Note also that reading doesn't wait for content (i.e. no blocking). After writing, the response might not be immediately available.
usage:

```scala
val bc = new ProcHandler(Seq("/usr/bin/bc", "-q"))
bc.feed("44-21\n", "21*4\n")
bc.feed("67+11\n")
if (bc.ready) bc.read() else "not ready" // "23\n84\n78\n"
bc.feed("67-11\n")
if (bc.ready) bc.read() else "not ready" // "56\n"
bc.feed("67*11\n", "1+2\n")
if (bc.ready) bc.read() else "not ready" // "737\n3\n"
if (bc.ready) bc.read() else "not ready" // "not ready"
bc.close()
```
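Since the read side doesn't block, one way to cope is to poll the underlying `InputStream` until bytes arrive or a timeout expires. This is a sketch, not part of the original answer; `awaitAvailable` is a hypothetical helper name:

```scala
import java.io.InputStream

// Hypothetical helper: poll until data is available or the timeout expires.
def awaitAvailable(is: InputStream, timeoutMs: Long): Boolean = {
  val deadline = System.currentTimeMillis + timeoutMs
  while (is.available() == 0 && System.currentTimeMillis < deadline)
    Thread.sleep(5)
  is.available() > 0
}
```

A `ready`-style check built on this would wait briefly for late output instead of immediately reporting "not ready".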
Source: stackoverflow.com