This rule raises an issue when a Scikit-Learn transformer used in a pipeline with caching is accessed directly.
When a Pipeline is created with a memory cache and its transformers are passed in as instances held in variables, those variables can still be accessed directly. This is an issue because the Pipeline clones all of its transformers when it is fitted: the objects outside the Pipeline are never updated, so inspecting them yields unexpected results.
To fix this, replace the direct access to the transformer with an access through the named_steps attribute of the pipeline.
Noncompliant code example:

from sklearn.datasets import load_diabetes
from sklearn.preprocessing import RobustScaler
from sklearn.neighbors import KNeighborsRegressor
from sklearn.pipeline import Pipeline
diabetes = load_diabetes()
scaler = RobustScaler()
knn = KNeighborsRegressor(n_neighbors=5)
pipeline = Pipeline([
    ('scaler', scaler),
    ('knn', knn)
], memory="cache")
pipeline.fit(diabetes.data, diabetes.target)
print(scaler.center_)  # Noncompliant: raises an AttributeError because scaler was never fitted
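The failure can be made explicit: when memory is set, fitting the pipeline clones each transformer, so the variable outside the pipeline is never fitted and never receives the fitted attributes. A minimal check, reusing the names from the example above:

```python
from sklearn.datasets import load_diabetes
from sklearn.preprocessing import RobustScaler
from sklearn.neighbors import KNeighborsRegressor
from sklearn.pipeline import Pipeline

diabetes = load_diabetes()
scaler = RobustScaler()
pipeline = Pipeline([
    ('scaler', scaler),
    ('knn', KNeighborsRegressor(n_neighbors=5))
], memory="cache")
pipeline.fit(diabetes.data, diabetes.target)

# The fitted step inside the pipeline is a clone, not the original object:
print(pipeline.named_steps['scaler'] is scaler)  # False
print(hasattr(scaler, 'center_'))  # False: the original was never fitted
print(hasattr(pipeline.named_steps['scaler'], 'center_'))  # True
```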
Compliant solution:

from sklearn.datasets import load_diabetes
from sklearn.preprocessing import RobustScaler
from sklearn.neighbors import KNeighborsRegressor
from sklearn.pipeline import Pipeline
diabetes = load_diabetes()
scaler = RobustScaler()
knn = KNeighborsRegressor(n_neighbors=5)
pipeline = Pipeline([
    ('scaler', scaler),
    ('knn', knn)
], memory="cache")
pipeline.fit(diabetes.data, diabetes.target)
print(pipeline.named_steps['scaler'].center_) # Compliant
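Besides named_steps, the fitted steps can also be retrieved by indexing the pipeline, either dict-like by step name or positionally; all of these return the same fitted (cloned) transformer. A short sketch:

```python
from sklearn.datasets import load_diabetes
from sklearn.preprocessing import RobustScaler
from sklearn.neighbors import KNeighborsRegressor
from sklearn.pipeline import Pipeline

diabetes = load_diabetes()
pipeline = Pipeline([
    ('scaler', RobustScaler()),
    ('knn', KNeighborsRegressor(n_neighbors=5))
], memory="cache")
pipeline.fit(diabetes.data, diabetes.target)

# All three expressions access the same fitted (cloned) transformer:
print(pipeline.named_steps['scaler'].center_)
print(pipeline['scaler'].center_)  # dict-like indexing by step name
print(pipeline[0].center_)         # positional indexing
```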