
Solving 'OSError: We couldn't connect to huggingface.co' in Offline Mode

Nothing stops a production deployment faster than an unexpected network call in an environment designed to be isolated. You have carefully containerized your machine learning inference service, verified the model files are inside the Docker image, and deployed it to an air-gapped Kubernetes cluster. Yet, upon startup, the application crashes with the dreaded error: OSError: We couldn't connect to 'https://huggingface.co' to load this model, couldn't find it in the cached files and it looks like <model_id> is not the path to a directory containing a file named config.json.

This error is misleading. Often, the files are indeed there, but the library is prioritizing a network handshake over the local filesystem. This guide covers the root cause of this behavior in the Hugging Face transformers library and provides the production-grade configuration to enforce offline execution.

Root Cause Analysis: Why from_pretrained Pings the Internet

To solve th...
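As a minimal sketch of the offline configuration the guide is building toward: the HF_HUB_OFFLINE and TRANSFORMERS_OFFLINE environment variables suppress all Hub network calls, and local_files_only=True enforces the same guarantee per call. The /models/bert directory below is a hypothetical path standing in for wherever the model files were baked into the image.

```python
import os

# Set these BEFORE importing transformers so no network handshake
# is ever attempted; both are documented offline switches.
os.environ["HF_HUB_OFFLINE"] = "1"        # disables all Hugging Face Hub HTTP calls
os.environ["TRANSFORMERS_OFFLINE"] = "1"  # transformers-specific offline mode

def load_model(model_dir: str):
    """Load strictly from the local filesystem.

    Raises an error immediately if config.json, tokenizer files,
    or weights are missing from model_dir, instead of trying the Hub.
    """
    from transformers import AutoModel, AutoTokenizer
    tokenizer = AutoTokenizer.from_pretrained(model_dir, local_files_only=True)
    model = AutoModel.from_pretrained(model_dir, local_files_only=True)
    return tokenizer, model

# Usage (hypothetical baked-in model directory):
# tokenizer, model = load_model("/models/bert")
```

Setting the environment variables at process start is the more robust option for an air-gapped cluster, since it also covers any library code that calls the Hub indirectly.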