This may just be my newness with Scala, but I resolved the issue by moving the case class declaration out of the main method. The simplified source now looks like this:

package com.bradkarels.simple

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import com.datastax.spark.connector._
import com.datastax.spark.connector.rdd._

object CaseStudy {

  case class SubHuman(id:String, firstname:String, lastname:String, isGoodPerson:Boolean)

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf(true)
      .set("", "")

    val sc = new SparkContext("spark://", "simple", conf)

    val foo = sc.cassandraTable[SubHuman]("nicecase", "human")
      .select("id", "firstname", "lastname", "isGoodPerson")
      .toArray
  }
}

The complete source (updated and fixed) can be found on GitHub.
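The likely reason this helps: the connector materializes rows into a case class via reflection, and a case class declared inside a method compiles to a synthetic local class (with a scope-dependent binary name and a hidden reference to its enclosing scope), which the reflective mapper cannot instantiate. A minimal plain-Scala sketch of the difference, using an illustrative `Demo` object (no Spark or Cassandra required):

```scala
object Demo {
  // Declared at object level: the class gets a stable binary name and no
  // hidden reference to an enclosing method scope, so reflection-based
  // row mappers can find and construct it.
  case class SubHuman(id: String, firstname: String, lastname: String, isGoodPerson: Boolean)

  def main(args: Array[String]): Unit = {
    // Stable, predictable name for the object-level case class:
    println(classOf[SubHuman].getName) // e.g. Demo$SubHuman

    // A case class declared inside a method compiles to a synthetic,
    // scope-dependent local class instead:
    case class Local(id: String)
    println(classOf[Local].getName) // e.g. Demo$Local$1
  }
}
```

So keeping `SubHuman` as a member of the enclosing object (rather than local to `main`) gives it the simple, reachable shape the connector expects.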
