SUSE Package Hub one-click install

Install perl-URI-Fetch

NOTE: This one-click installation requires that the SUSE Package Hub extension already be enabled. See http://packagehub.suse.com/how-to-use/ for information on enabling the Package Hub extension. If the extension is not enabled, this installation will fail while trying to enable an invalid repo. This package might depend on packages from SUSE Linux Enterprise modules; if those modules are not enabled, a package dependency error will be encountered.

The one-click install is available for the following Package Hub releases and repositories:

* SUSE Package Hub 15 SP2: SUSE-PackageHub-15-SP2-Backports-Pool
* SUSE Package Hub 15 SP3: SUSE-PackageHub-15-SP3-Backports-Pool
* SUSE Package Hub 15 SP4: SUSE-PackageHub-15-SP4-Backports-Pool
* SUSE Package Hub 15 SP5: SUSE-PackageHub-15-SP5-Standard-Pool
perl-URI-Fetch - Smart URI fetching/caching

_URI::Fetch_ is a smart client for fetching HTTP pages, notably syndication feeds (RSS, Atom, and others), in an intelligent, bandwidth- and time-saving way. That means:

* GZIP support

  If you have _Compress::Zlib_ installed, _URI::Fetch_ will automatically try to download a compressed version of the content, saving bandwidth (and time).

* _Last-Modified_ and _ETag_ support

  If you use a local cache (see the _Cache_ parameter to _fetch_), _URI::Fetch_ will keep track of the _Last-Modified_ and _ETag_ headers from the server, allowing you to only download pages that have been modified since the last time you checked.

* Proper understanding of HTTP error codes

  Certain HTTP error codes are special, particularly when fetching syndication feeds, and well-written clients should pay special attention to them. _URI::Fetch_ can only do so much for you in this regard, but it gives you the tools to be a well-written client. The response from _fetch_ gives you the raw HTTP response code, along with special handling of 4 codes (a short usage sketch follows this list):

  * 200 (OK): signals that the content of a page/feed was retrieved successfully.
  * 301 (Moved Permanently): signals that a page/feed has moved permanently, and that your database of feeds should be updated to reflect the new URI.
  * 304 (Not Modified): signals that a page/feed has not changed since it was last fetched.
  * 410 (Gone): signals that a page/feed is gone and will never be coming back, so you should stop trying to fetch it.
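The following is a minimal sketch of how the caching and status-code handling described above might look in practice. The feed URL and cache directory are placeholders, Cache::File is only one example of a usable cache backend, and the response accessors shown (http_status, content, uri) follow the module's published synopsis but should be checked against the installed version's documentation.

    #!/usr/bin/perl
    use strict;
    use warnings;

    use URI::Fetch;
    use Cache::File;    # example cache backend; any compatible cache object should work

    # A persistent local cache lets URI::Fetch remember Last-Modified/ETag values
    # between runs, so unchanged feeds are not downloaded again.
    my $cache = Cache::File->new( cache_root => '/tmp/feed-cache' );   # placeholder path

    my $res = URI::Fetch->fetch(
        'http://example.com/atom.xml',         # placeholder feed URL
        Cache => $cache,
    ) or die 'fetch failed: ' . URI::Fetch->errstr;

    # React to the raw HTTP response code, as described above.
    my $code = $res->http_status;

    if ( $code == 200 ) {
        my $xml = $res->content;               # hand this to a feed parser
        print "Feed retrieved successfully\n";
    }
    elsif ( $code == 301 ) {
        # Permanent move: record the new location for future polls.
        print 'Feed moved permanently; update the stored URI to ' . $res->uri . "\n";
    }
    elsif ( $code == 304 ) {
        print "Feed unchanged since the last fetch; the cached copy is current\n";
    }
    elsif ( $code == 410 ) {
        print "Feed is gone for good; stop polling it\n";
    }

Run against a real feed, only the first request should transfer the full document; later runs send the stored ETag/Last-Modified values and will typically come back as 304 unless the feed has changed.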