Score: 2
I ran into the same issue (on Windows Server 2012 R2) when I didn't close the stream: all the files I iterated over stayed open in read mode until the JVM shut down. The problem did not occur on Mac OS X, however, and since the stream depends on OS-specific implementations of FileSystemProvider and DirectoryStream, I assume the issue can be OS-dependent, too.
Contrary to @Ian McLaird's comment, it is mentioned in the Files.list() documentation that
"If timely disposal of file system resources is required, the try-with-resources construct should be used to ensure that the stream's close method is invoked after the stream operations are completed."
The returned stream encapsulates a DirectoryStream, whose Javadoc says:
"A DirectoryStream is opened upon creation and is closed by invoking the close method. Closing a directory stream releases any resources associated with the stream. Failure to close the stream may result in a resource leak."
My solution was to follow this advice and use the try-with-resources construct:

try (Stream<Path> fileListing = Files.list(directoryPath)) {
    // use the fileListing stream
}
When I closed the stream properly (using the try-with-resources construct above), the file handles were released immediately.
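To make the advice above concrete, here is a minimal self-contained sketch (the temp-directory setup and file names are illustrative, not from the original answer) that lists a directory with Files.list() inside try-with-resources, so the underlying DirectoryStream and its OS file handle are released as soon as the block exits:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class ListFilesDemo {
    public static void main(String[] args) throws IOException {
        // Demo setup: a temporary directory with two files.
        Path dir = Files.createTempDirectory("list-demo");
        Files.createFile(dir.resolve("a.txt"));
        Files.createFile(dir.resolve("b.txt"));

        // try-with-resources guarantees Stream.close() is called,
        // which closes the backing DirectoryStream (and its file handle).
        try (Stream<Path> fileListing = Files.list(dir)) {
            List<Path> files = fileListing.sorted().collect(Collectors.toList());
            System.out.println(files.size()); // 2
        }
        // The directory handle is already released at this point.
    }
}
```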
If you don't need the files as a stream, or you are OK with loading the whole file list into memory and converting it to a stream yourself, you can use the java.io API:

File directory = new File("/path/to/dir");
File[] files = directory.listFiles();
if (files != null) { // 'files' is null if 'directory' "does not denote a directory, or if an I/O error occurs"
    // use the 'files' array, or convert it to a stream:
    Stream<File> fileStream = Arrays.stream(files);
}
I did not experience any file-locking issues with this approach. Note, however, that both solutions rely on native, OS-dependent code, so I advise testing in every environment you target.
Score: 1
You may use the Apache Commons IO FileUtils library, which uses the old java.io.File.listFiles method internally:

Iterator<File> it = FileUtils.iterateFiles(folder, null, true);
while (it.hasNext()) {
    File fileEntry = it.next();
}
Score: 4
If that happens, why not use old-school java.io.File?

File folder = new File(pathToFolder);
String[] files = folder.list();

Tested with lsof, and it looks like none of the listed files stays open. You can convert the array to a list or a stream afterwards. Unless the directory is too large or remote; in that case I would suspect the Path objects and try to garbage-collect or otherwise dispose of them.
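A minimal sketch of this old-school approach (the temp-directory setup is illustrative, not from the original answer): File.list() reads the directory eagerly and returns plain names, so no stream or open handle is kept afterwards, and the array can be turned into a stream on demand:

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class OldSchoolListDemo {
    public static void main(String[] args) throws IOException {
        // Demo setup: a temporary directory with one file.
        File folder = Files.createTempDirectory("old-school-demo").toFile();
        new File(folder, "x.txt").createNewFile();

        // File.list() returns the entry names eagerly; nothing stays open.
        String[] files = folder.list();
        if (files != null) { // null on I/O error or if 'folder' is not a directory
            List<String> names = Arrays.stream(files).sorted().collect(Collectors.toList());
            System.out.println(names); // [x.txt]
        }
    }
}
```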
Source: stackoverflow.com