score:1
Use the datetime module:

from datetime import datetime, timedelta

# one hour before the current time
latest_hour = datetime.now() - timedelta(hours=1)

You can also access the year, month, day, and hour components individually:

latest_hour.year
latest_hour.month
latest_hour.day
latest_hour.hour
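Since the original question asks for the latest hour in an S3 path, the timestamp components can be formatted into a partitioned path with strftime. A minimal sketch, assuming a hypothetical bucket name and a year=/month=/day=/hour= partition layout (adjust both to your actual S3 structure):

```python
from datetime import datetime, timedelta

# one hour before the current time
latest_hour = datetime.now() - timedelta(hours=1)

# hypothetical bucket and partition layout -- replace with your real S3 structure
s3_path = latest_hour.strftime(
    "s3://my-bucket/events/year=%Y/month=%m/day=%d/hour=%H/"
)
print(s3_path)
```

The resulting string can then be passed to spark.read (e.g. spark.read.parquet(s3_path)) to load only that hour's partition.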
Source: stackoverflow.com