DefaultParamsReader

class pyspark.ml.util.DefaultParamsReader(cls)
Specialization of MLReader for Params types.

Default MLReader implementation for transformers and estimators that contain basic (json-serializable) params and no data. This will not handle more complex params or types with data (e.g., models with coefficients).

New in version 2.3.0.

Methods

getAndSetParams(instance, metadata[, skipParams])
    Extract Params from metadata, and set them in the instance.
isPythonParamsInstance(metadata)
load(path)
    Load the ML instance from the input path.
loadMetadata(path, sc[, expectedClassName])
    Load metadata saved using DefaultParamsWriter.saveMetadata().
loadParamsInstance(path, sc)
    Load a Params instance from the given path, and return it.
session(sparkSession)
    Sets the Spark Session to use for saving/loading.

Attributes

sc
    Returns the underlying SparkContext.
sparkSession
    Returns the user-specified Spark Session or the default.
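For orientation, here is a minimal sketch of the kind of class this reader serves: a transformer whose only state is JSON-serializable params. The MyShift class, its shift param, and the /tmp path are illustrative assumptions; DefaultParamsReadable and DefaultParamsWritable are the real pyspark.ml.util mixins that route save/load through DefaultParamsWriter and DefaultParamsReader.

```python
from pyspark.ml import Transformer
from pyspark.ml.param import Param, Params
from pyspark.ml.util import DefaultParamsReadable, DefaultParamsWritable
from pyspark.sql import SparkSession, functions as F


class MyShift(Transformer, DefaultParamsReadable, DefaultParamsWritable):
    """Hypothetical transformer: adds a constant to a 'value' column."""

    shift = Param(Params._dummy(), "shift", "constant added to 'value'")

    def __init__(self, shift=0.0):
        super().__init__()
        self._set(shift=shift)

    def _transform(self, dataset):
        return dataset.withColumn(
            "value", F.col("value") + self.getOrDefault(self.shift)
        )


spark = SparkSession.builder.getOrCreate()
t = MyShift(shift=2.0)
t.write().overwrite().save("/tmp/my_shift")   # handled by DefaultParamsWriter
loaded = MyShift.load("/tmp/my_shift")        # handled by DefaultParamsReader
assert loaded.getOrDefault(loaded.shift) == 2.0
```

Saving writes the params as JSON metadata under the given path; the methods documented below read that metadata back.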
Methods Documentation

static getAndSetParams(instance, metadata, skipParams=None)

Extract Params from metadata, and set them in the instance.
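A hedged sketch of how this fits into a load: the metadata dict comes from loadMetadata() below, and the params it records are pushed into a freshly constructed instance. MyShift and the save path reuse the hypothetical example above.

```python
from pyspark.ml.util import DefaultParamsReader

# Reuses the hypothetical MyShift and save path from the sketch above.
meta = DefaultParamsReader.loadMetadata("/tmp/my_shift", spark.sparkContext)
inst = MyShift()                                  # fresh instance with defaults
DefaultParamsReader.getAndSetParams(inst, meta)   # set params recorded in metadata
assert inst.getOrDefault(inst.shift) == 2.0
```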
static loadMetadata(path, sc, expectedClassName='')

Load metadata saved using DefaultParamsWriter.saveMetadata().

Parameters

path : str
sc : pyspark.SparkContext or pyspark.sql.SparkSession
expectedClassName : str, optional
    If non-empty, this is checked against the loaded metadata.
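A short sketch of reading metadata back directly; the key names ("class", "paramMap") and the fully qualified class name are assumptions about what DefaultParamsWriter.saveMetadata() records.

```python
from pyspark.ml.util import DefaultParamsReader
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# expectedClassName (if given) is compared against the class name stored
# in the metadata, so loading from a mismatched directory fails fast.
meta = DefaultParamsReader.loadMetadata(
    "/tmp/my_shift",                        # path from the sketch above
    spark.sparkContext,
    expectedClassName="__main__.MyShift",   # assumed fully qualified name
)
print(meta["class"], meta["paramMap"])
```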
 
 
static loadParamsInstance(path, sc)

Load a Params instance from the given path, and return it. This assumes the instance inherits from MLReadable.
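A sketch of dynamic loading: the concrete class is discovered from the saved metadata and its own reader performs the load, so the caller does not need to know the type in advance (path reused from the hypothetical example above).

```python
from pyspark.ml.util import DefaultParamsReader
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
inst = DefaultParamsReader.loadParamsInstance("/tmp/my_shift", spark.sparkContext)
print(type(inst).__name__)   # the concrete class recorded in the metadata
```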
session(sparkSession)

Sets the Spark Session to use for saving/loading.
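session() returns the reader itself, so it chains; a short sketch reusing the hypothetical MyShift from above:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
# read() returns a DefaultParamsReader; session() sets the session and
# returns the same reader, so the call chains straight into load().
loaded = MyShift.read().session(spark).load("/tmp/my_shift")
```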
Attributes Documentation

sc

Returns the underlying SparkContext.
sparkSession

Returns the user-specified Spark Session or the default.