val Array(_, t) = readLine.split(" ").map(_.toInt) works when the line contains valid data. If you only need the second token and know it is valid, use this instead:
val t = line.split(" ")(1).toInt
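As a minimal sketch of both approaches (the sample `line` value and the `secondInt` helper are made up for illustration; a real program would read the line with `scala.io.StdIn.readLine()`):

```scala
import scala.util.Try

object SecondValue {
  def main(args: Array[String]): Unit = {
    val line = "3 7" // stand-in for a line read from input

    // Pattern match when the line is known to hold exactly two ints:
    val Array(_, t) = line.split(" ").map(_.toInt)
    println(t) // 7

    // Index directly when only the second token matters:
    val t2 = line.split(" ")(1).toInt
    println(t2) // 7

    // A safer variant for possibly malformed input: returns None
    // instead of throwing when the token is missing or not a number.
    def secondInt(s: String): Option[Int] = {
      val parts = s.split("\\s+")
      if (parts.length > 1) Try(parts(1).toInt).toOption else None
    }

    println(secondInt("3 7"))     // Some(7)
    println(secondInt("garbage")) // None
  }
}
```

Note that `val Array(_, t) = ...` throws a `MatchError` if the line has fewer than two tokens, and both eager forms throw `NumberFormatException` on non-numeric input, which is why the `Option`-returning helper can be preferable for untrusted data.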