
wget package update

Signed-off-by: basebuilder_pel7x64builder0 <basebuilder@powerel.org>
branch: master
basebuilder_pel7x64builder0 committed 6 years ago
commit 06d4ff90e7
24 changed files, 2303 lines added:

  1. SOURCES/wget-1.12-path.patch (+169)
  2. SOURCES/wget-1.14-CVE-2014-4877.patch (+151)
  3. SOURCES/wget-1.14-CVE-2016-4971.patch (+261)
  4. SOURCES/wget-1.14-CVE-2017-13089.patch (+18)
  5. SOURCES/wget-1.14-CVE-2017-13090.patch (+21)
  6. SOURCES/wget-1.14-CVE-2018-0494.patch (+43)
  7. SOURCES/wget-1.14-Fix-deadcode-and-possible-NULL-use.patch (+47)
  8. SOURCES/wget-1.14-add-openssl-tlsv11-tlsv12-support.patch (+122)
  9. SOURCES/wget-1.14-add_missing_options_doc.patch (+27)
  10. SOURCES/wget-1.14-digest-auth-qop-segfault-fix.patch (+22)
  11. SOURCES/wget-1.14-doc-missing-opts-and-fix-preserve-permissions.patch (+61)
  12. SOURCES/wget-1.14-document-backups.patch (+60)
  13. SOURCES/wget-1.14-fix-backups-to-work-as-documented.patch (+80)
  14. SOURCES/wget-1.14-fix-double-free-of-iri-orig_url.patch (+29)
  15. SOURCES/wget-1.14-fix-synchronization-in-Test-proxied-https-auth.patch (+164)
  16. SOURCES/wget-1.14-manpage-tex5.patch (+55)
  17. SOURCES/wget-1.14-rh1147572.patch (+26)
  18. SOURCES/wget-1.14-rh1203384.patch (+30)
  19. SOURCES/wget-1.14-set_sock_to_-1_if_no_persistent_conn.patch (+32)
  20. SOURCES/wget-1.14-sslreadtimeout.patch (+105)
  21. SOURCES/wget-1.14-support-non-ASCII-characters.patch (+154)
  22. SOURCES/wget-1.14-texi2pod_error_perl518.patch (+25)
  23. SOURCES/wget-rh-modified.patch (+11)
  24. SPECS/wget.spec (+590)

SOURCES/wget-1.12-path.patch (+169)

@@ -0,0 +1,169 @@
diff -urN wget-1.12/doc/sample.wgetrc wget-1.12.patched/doc/sample.wgetrc
--- wget-1.12/doc/sample.wgetrc 2009-09-22 04:53:58.000000000 +0200
+++ wget-1.12.patched/doc/sample.wgetrc 2009-11-17 12:29:18.000000000 +0100
@@ -7,7 +7,7 @@
## not contain a comprehensive list of commands -- look at the manual
## to find out what you can put into this file.
##
-## Wget initialization file can reside in /usr/local/etc/wgetrc
+## Wget initialization file can reside in /etc/wgetrc
## (global, for all users) or $HOME/.wgetrc (for a single user).
##
## To use the settings in this file, you will have to uncomment them,
@@ -16,7 +16,7 @@
##
-## Global settings (useful for setting up in /usr/local/etc/wgetrc).
+## Global settings (useful for setting up in /etc/wgetrc).
## Think well before you change them, since they may reduce wget's
## functionality, and make it behave contrary to the documentation:
##
diff -urN wget-1.12/doc/sample.wgetrc.munged_for_texi_inclusion wget-1.12.patched/doc/sample.wgetrc.munged_for_texi_inclusion
--- wget-1.12/doc/sample.wgetrc.munged_for_texi_inclusion 2009-09-22 06:08:52.000000000 +0200
+++ wget-1.12.patched/doc/sample.wgetrc.munged_for_texi_inclusion 2009-11-17 12:29:39.000000000 +0100
@@ -7,7 +7,7 @@
## not contain a comprehensive list of commands -- look at the manual
## to find out what you can put into this file.
##
-## Wget initialization file can reside in /usr/local/etc/wgetrc
+## Wget initialization file can reside in /etc/wgetrc
## (global, for all users) or $HOME/.wgetrc (for a single user).
##
## To use the settings in this file, you will have to uncomment them,
@@ -16,7 +16,7 @@
##
-## Global settings (useful for setting up in /usr/local/etc/wgetrc).
+## Global settings (useful for setting up in /etc/wgetrc).
## Think well before you change them, since they may reduce wget's
## functionality, and make it behave contrary to the documentation:
##
diff -urN wget-1.12/doc/wget.info wget-1.12.patched/doc/wget.info
--- wget-1.12/doc/wget.info 2009-09-22 18:30:20.000000000 +0200
+++ wget-1.12.patched/doc/wget.info 2009-11-17 12:28:40.000000000 +0100
@@ -2351,8 +2351,8 @@
===================
When initializing, Wget will look for a "global" startup file,
-`/usr/local/etc/wgetrc' by default (or some prefix other than
-`/usr/local', if Wget was not installed there) and read commands from
+`/etc/wgetrc' by default (or some prefix other than
+`/etc', if Wget was not installed there) and read commands from
there, if it exists.
Then it will look for the user's file. If the environmental variable
@@ -2363,7 +2363,7 @@
The fact that user's settings are loaded after the system-wide ones
means that in case of collision user's wgetrc _overrides_ the
-system-wide wgetrc (in `/usr/local/etc/wgetrc' by default). Fascist
+system-wide wgetrc (in `/etc/wgetrc' by default). Fascist
admins, away!

@@ -2876,7 +2876,7 @@
## not contain a comprehensive list of commands -- look at the manual
## to find out what you can put into this file.
##
- ## Wget initialization file can reside in /usr/local/etc/wgetrc
+ ## Wget initialization file can reside in /etc/wgetrc
## (global, for all users) or $HOME/.wgetrc (for a single user).
##
## To use the settings in this file, you will have to uncomment them,
@@ -2885,7 +2885,7 @@
##
- ## Global settings (useful for setting up in /usr/local/etc/wgetrc).
+ ## Global settings (useful for setting up in /etc/wgetrc).
## Think well before you change them, since they may reduce wget's
## functionality, and make it behave contrary to the documentation:
##
diff -urN wget-1.12/doc/wget.texi wget-1.12.patched/doc/wget.texi
--- wget-1.12/doc/wget.texi 2009-09-04 23:22:04.000000000 +0200
+++ wget-1.12.patched/doc/wget.texi 2009-11-17 12:29:03.000000000 +0100
@@ -2670,8 +2670,8 @@
@cindex location of wgetrc
When initializing, Wget will look for a @dfn{global} startup file,
-@file{/usr/local/etc/wgetrc} by default (or some prefix other than
-@file{/usr/local}, if Wget was not installed there) and read commands
+@file{/etc/wgetrc} by default (or some prefix other than
+@file{/etc}, if Wget was not installed there) and read commands
from there, if it exists.
Then it will look for the user's file. If the environmental variable
@@ -2682,7 +2682,7 @@
The fact that user's settings are loaded after the system-wide ones
means that in case of collision user's wgetrc @emph{overrides} the
-system-wide wgetrc (in @file{/usr/local/etc/wgetrc} by default).
+system-wide wgetrc (in @file{/etc/wgetrc} by default).
Fascist admins, away!
@node Wgetrc Syntax, Wgetrc Commands, Wgetrc Location, Startup File
diff -urN wget-1.12/NEWS wget-1.12.patched/NEWS
--- wget-1.12/NEWS 2009-09-22 04:53:35.000000000 +0200
+++ wget-1.12.patched/NEWS 2009-11-17 12:30:10.000000000 +0100
@@ -562,7 +562,7 @@
** Compiles on pre-ANSI compilers.
-** Global wgetrc now goes to /usr/local/etc (i.e. $sysconfdir).
+** Global wgetrc now goes to /etc (i.e. $sysconfdir).
** Lots of bugfixes.
@@ -625,7 +625,7 @@
** Fixed a long-standing bug, so that Wget now works over SLIP
connections.
-** You can have a system-wide wgetrc (/usr/local/lib/wgetrc by
+** You can have a system-wide wgetrc (/etc/wgetrc by
default). Settings in $HOME/.wgetrc override the global ones, of
course :-)
diff -urN wget-1.12/README wget-1.12.patched/README
--- wget-1.12/README 2009-09-21 00:59:32.000000000 +0200
+++ wget-1.12.patched/README 2009-11-17 12:30:27.000000000 +0100
@@ -33,7 +33,7 @@
Most of the features are configurable, either through command-line
options, or via initialization file .wgetrc. Wget allows you to
-install a global startup file (/usr/local/etc/wgetrc by default) for
+install a global startup file (/etc/wgetrc by default) for
site settings.
Wget works under almost all Unix variants in use today and, unlike
--- wget-1.12/doc/wget.info.start 2011-12-19 10:34:29.409272713 -0600
+++ wget-1.12/doc/wget.info 2011-12-19 10:34:51.760129197 -0600
@@ -113,7 +113,7 @@
* Most of the features are fully configurable, either through
command line options, or via the initialization file `.wgetrc'
(*note Startup File::). Wget allows you to define "global"
- startup files (`/usr/local/etc/wgetrc' by default) for site
+ startup files (`/etc/wgetrc' by default) for site
settings. You can also specify the location of a startup file with
the -config option.
--- wget-1.12/doc/wget.texi.start 2011-12-19 10:38:18.305730849 -0600
+++ wget-1.12/doc/wget.texi 2011-12-19 10:38:49.272615753 -0600
@@ -190,14 +190,14 @@
Most of the features are fully configurable, either through command line
options, or via the initialization file @file{.wgetrc} (@pxref{Startup
File}). Wget allows you to define @dfn{global} startup files
-(@file{/usr/local/etc/wgetrc} by default) for site settings. You can also
+(@file{/etc/wgetrc} by default) for site settings. You can also
specify the location of a startup file with the --config option.
@ignore
@c man begin FILES
@table @samp
-@item /usr/local/etc/wgetrc
+@item /etc/wgetrc
Default location of the @dfn{global} startup file.
@item .wgetrc

SOURCES/wget-1.14-CVE-2014-4877.patch (+151)

@@ -0,0 +1,151 @@
From 043366ac3248a58662a6fbf47a1dd688a75d0e78 Mon Sep 17 00:00:00 2001
From: Darshit Shah <darnir@gmail.com>
Date: Mon, 8 Sep 2014 00:41:17 +0530
Subject: [PATCH 1/2] Fix R7-2014-15: Arbitrary Symlink Access

Wget was susceptible to a symlink attack which could create arbitrary
files, directories or symbolic links and set their permissions when
retrieving a directory recursively through FTP. This commit changes the
default settings in Wget such that Wget no longer creates local symbolic
links, but rather traverses them and retrieves the pointed-to file in
such a retrieval.

The old behaviour can be restored by passing the --retr-symlinks=no
option on the Wget invocation.
---
doc/wget.texi | 23 ++++++++++++-----------
src/init.c | 16 ++++++++++++++++
2 files changed, 28 insertions(+), 11 deletions(-)

diff --git a/doc/wget.texi b/doc/wget.texi
index a31eb5e..f54e98d 100644
--- a/doc/wget.texi
+++ b/doc/wget.texi
@@ -1883,17 +1883,18 @@ Preserve remote file permissions instead of permissions set by umask.
@cindex symbolic links, retrieving
@item --retr-symlinks
-Usually, when retrieving @sc{ftp} directories recursively and a symbolic
-link is encountered, the linked-to file is not downloaded. Instead, a
-matching symbolic link is created on the local filesystem. The
-pointed-to file will not be downloaded unless this recursive retrieval
-would have encountered it separately and downloaded it anyway.
-
-When @samp{--retr-symlinks} is specified, however, symbolic links are
-traversed and the pointed-to files are retrieved. At this time, this
-option does not cause Wget to traverse symlinks to directories and
-recurse through them, but in the future it should be enhanced to do
-this.
+By default, when retrieving @sc{ftp} directories recursively and a symbolic link
+is encountered, the symbolic link is traversed and the pointed-to files are
+retrieved. Currently, Wget does not traverse symbolic links to directories to
+download them recursively, though this feature may be added in the future.
+
+When @samp{--retr-symlinks=no} is specified, the linked-to file is not
+downloaded. Instead, a matching symbolic link is created on the local
+filesystem. The pointed-to file will not be retrieved unless this recursive
+retrieval would have encountered it separately and downloaded it anyway. This
+option poses a security risk where a malicious FTP Server may cause Wget to
+write to files outside of the intended directories through a specially crafted
+@sc{.listing} file.
Note that when retrieving a file (not a directory) because it was
specified on the command-line, rather than because it was recursed to,
diff --git a/src/init.c b/src/init.c
index 93e95f8..94b6f8b 100644
--- a/src/init.c
+++ b/src/init.c
@@ -366,6 +366,22 @@ defaults (void)
opt.dns_cache = true;
opt.ftp_pasv = true;
+ /* 2014-09-07 Darshit Shah <darnir@gmail.com>
+ * opt.retr_symlinks is set to true by default. Creating symbolic links on the
+ * local filesystem poses a security threat: a malicious FTP server can
+ * serve a specially crafted .listing file akin to this:
+ *
+ * lrwxrwxrwx 1 root root 33 Dec 25 2012 JoCxl6d8rFU -> /
+ * drwxrwxr-x 15 1024 106 4096 Aug 28 02:02 JoCxl6d8rFU
+ *
+ * A .listing file in this fashion makes Wget susceptible to a symlink attack
+ * wherein the attacker is able to create arbitrary files, directories and
+ * symbolic links on the target system and even set permissions.
+ *
+ * Hence, by default Wget attempts to retrieve the pointed-to files and does
+ * not create the symbolic links locally.
+ */
+ opt.retr_symlinks = true;
#ifdef HAVE_SSL
opt.check_cert = true;
--
2.1.0

From bfa8c9cc9937f686a4de110e49710061267f8d9e Mon Sep 17 00:00:00 2001
From: Darshit Shah <darnir@gmail.com>
Date: Mon, 8 Sep 2014 15:07:45 +0530
Subject: [PATCH 2/2] Add checks for valid listing file in FTP

When Wget retrieves a file through FTP, it first downloads a .listing
file and parses it for information about the files and other metadata.
Some servers may serve invalid .listing files. This patch checks for one
such known inconsistency, wherein multiple entries in a listing file bear
the same name. Since no real filesystem allows this, duplicate entries
are eliminated here.

Signed-off-by: Darshit Shah <darnir@gmail.com>
---
src/ftp.c | 27 +++++++++++++++++++++++++--
1 file changed, 25 insertions(+), 2 deletions(-)

diff --git a/src/ftp.c b/src/ftp.c
index 2d54333..054cb61 100644
--- a/src/ftp.c
+++ b/src/ftp.c
@@ -2211,6 +2211,29 @@ has_insecure_name_p (const char *s)
return false;
}
+/* Test if the file node is invalid. This can occur due to malformed or
+ * maliciously crafted listing files being returned by the server.
+ *
+ * Currently, this function only tests if there are multiple entries in the
+ * listing file by the same name. However this function can be expanded as more
+ * such illegal listing formats are discovered. */
+static bool
+is_invalid_entry (struct fileinfo *f)
+{
+ struct fileinfo *cur;
+ cur = f;
+ char *f_name = f->name;
+ /* If the node we're currently checking has a duplicate later, we eliminate
+ * the current node and leave the next one intact. */
+ while (cur->next)
+ {
+ cur = cur->next;
+ if (strcmp(f_name, cur->name) == 0)
+ return true;
+ }
+ return false;
+}
+
/* A near-top-level function to retrieve the files in a directory.
The function calls ftp_get_listing, to get a linked list of files.
Then it weeds out the file names that do not match the pattern.
@@ -2248,11 +2271,11 @@ ftp_retrieve_glob (struct url *u, ccon *con, int action)
f = f->next;
}
}
- /* Remove all files with possible harmful names */
+ /* Remove all files with possible harmful names or invalid entries. */
f = start;
while (f)
{
- if (has_insecure_name_p (f->name))
+ if (has_insecure_name_p (f->name) || is_invalid_entry (f))
{
logprintf (LOG_VERBOSE, _("Rejecting %s.\n"),
quote (f->name));
--
2.1.0
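
To make the second fix concrete, here is a minimal standalone model of the duplicate-entry check (the struct and names are simplified stand-ins, not Wget's real fileinfo API), fed the malicious listing quoted in the patch comment:

/* sketch: model of the is_invalid_entry() duplicate-name rule. */
#include <stdio.h>
#include <string.h>

struct fileinfo { const char *name; struct fileinfo *next; };

/* Returns 1 if a later entry shares this entry's name, mirroring the
   patch's rule: drop the current node, keep the last duplicate. */
static int is_duplicate_entry (const struct fileinfo *f)
{
  const struct fileinfo *cur;
  for (cur = f->next; cur; cur = cur->next)
    if (strcmp (f->name, cur->name) == 0)
      return 1;
  return 0;
}

int main (void)
{
  struct fileinfo c = { "JoCxl6d8rFU", NULL };   /* directory entry */
  struct fileinfo b = { "JoCxl6d8rFU", &c };     /* symlink entry   */
  struct fileinfo a = { "README", &b };
  for (struct fileinfo *p = &a; p; p = p->next)
    printf ("%s: %s\n", p->name,
            is_duplicate_entry (p) ? "rejected" : "kept");
  return 0;
}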

SOURCES/wget-1.14-CVE-2016-4971.patch (+261)

@@ -0,0 +1,261 @@
diff --git a/src/ftp.c b/src/ftp.c
index 2be2c76..345718f 100644
--- a/src/ftp.c
+++ b/src/ftp.c
@@ -234,14 +234,15 @@ print_length (wgint size, wgint start, bool authoritative)
logputs (LOG_VERBOSE, !authoritative ? _(" (unauthoritative)\n") : "\n");
}
-static uerr_t ftp_get_listing (struct url *, ccon *, struct fileinfo **);
+static uerr_t ftp_get_listing (struct url *, struct url *, ccon *, struct fileinfo **);
/* Retrieves a file with denoted parameters through opening an FTP
connection to the server. It always closes the data connection,
and closes the control connection in case of error. If warc_tmp
is non-NULL, the downloaded data will be written there as well. */
static uerr_t
-getftp (struct url *u, wgint passed_expected_bytes, wgint *qtyread,
+getftp (struct url *u, struct url *original_url,
+ wgint passed_expected_bytes, wgint *qtyread,
wgint restval, ccon *con, int count, FILE *warc_tmp)
{
int csock, dtsock, local_sock, res;
@@ -944,7 +945,7 @@ Error in server response, closing control connection.\n"));
bool exists = false;
uerr_t res;
struct fileinfo *f;
- res = ftp_get_listing (u, con, &f);
+ res = ftp_get_listing (u, original_url, con, &f);
/* Set the DO_RETR command flag again, because it gets unset when
calling ftp_get_listing() and would otherwise cause an assertion
failure earlier on when this function gets repeatedly called
@@ -1392,7 +1393,8 @@ Error in server response, closing control connection.\n"));
This loop either gets commands from con, or (if ON_YOUR_OWN is
set), makes them up to retrieve the file given by the URL. */
static uerr_t
-ftp_loop_internal (struct url *u, struct fileinfo *f, ccon *con, char **local_file)
+ftp_loop_internal (struct url *u, struct url *original_url, struct fileinfo *f,
+ ccon *con, char **local_file)
{
int count, orig_lp;
wgint restval, len = 0, qtyread = 0;
@@ -1415,7 +1417,7 @@ ftp_loop_internal (struct url *u, struct fileinfo *f, ccon *con, char **local_fi
else
{
/* URL-derived file. Consider "-O file" name. */
- con->target = url_file_name (u, NULL);
+ con->target = url_file_name (opt.trustservernames || !original_url ? u : original_url, NULL);
if (!opt.output_document)
locf = con->target;
else
@@ -1524,7 +1526,7 @@ ftp_loop_internal (struct url *u, struct fileinfo *f, ccon *con, char **local_fi
/* If we are working on a WARC record, getftp should also write
to the warc_tmp file. */
- err = getftp (u, len, &qtyread, restval, con, count, warc_tmp);
+ err = getftp (u, original_url, len, &qtyread, restval, con, count, warc_tmp);
if (con->csock == -1)
con->st &= ~DONE_CWD;
@@ -1677,7 +1679,8 @@ Removing file due to --delete-after in ftp_loop_internal():\n"));
/* Return the directory listing in a reusable format. The directory
is specifed in u->dir. */
static uerr_t
-ftp_get_listing (struct url *u, ccon *con, struct fileinfo **f)
+ftp_get_listing (struct url *u, struct url *original_url, ccon *con,
+ struct fileinfo **f)
{
uerr_t err;
char *uf; /* url file name */
@@ -1698,7 +1701,7 @@ ftp_get_listing (struct url *u, ccon *con, struct fileinfo **f)
con->target = xstrdup (lf);
xfree (lf);
- err = ftp_loop_internal (u, NULL, con, NULL);
+ err = ftp_loop_internal (u, original_url, NULL, con, NULL);
lf = xstrdup (con->target);
xfree (con->target);
con->target = old_target;
@@ -1721,8 +1724,9 @@ ftp_get_listing (struct url *u, ccon *con, struct fileinfo **f)
return err;
}
-static uerr_t ftp_retrieve_dirs (struct url *, struct fileinfo *, ccon *);
-static uerr_t ftp_retrieve_glob (struct url *, ccon *, int);
+static uerr_t ftp_retrieve_dirs (struct url *, struct url *,
+ struct fileinfo *, ccon *);
+static uerr_t ftp_retrieve_glob (struct url *, struct url *, ccon *, int);
static struct fileinfo *delelement (struct fileinfo *, struct fileinfo **);
static void freefileinfo (struct fileinfo *f);
@@ -1734,7 +1738,8 @@ static void freefileinfo (struct fileinfo *f);
If opt.recursive is set, after all files have been retrieved,
ftp_retrieve_dirs will be called to retrieve the directories. */
static uerr_t
-ftp_retrieve_list (struct url *u, struct fileinfo *f, ccon *con)
+ftp_retrieve_list (struct url *u, struct url *original_url,
+ struct fileinfo *f, ccon *con)
{
static int depth = 0;
uerr_t err;
@@ -1893,7 +1898,9 @@ Already have correct symlink %s -> %s\n\n"),
else /* opt.retr_symlinks */
{
if (dlthis)
- err = ftp_loop_internal (u, f, con, NULL);
+ {
+ err = ftp_loop_internal (u, original_url, f, con, NULL);
+ }
} /* opt.retr_symlinks */
break;
case FT_DIRECTORY:
@@ -1904,7 +1911,9 @@ Already have correct symlink %s -> %s\n\n"),
case FT_PLAINFILE:
/* Call the retrieve loop. */
if (dlthis)
- err = ftp_loop_internal (u, f, con, NULL);
+ {
+ err = ftp_loop_internal (u, original_url, f, con, NULL);
+ }
break;
case FT_UNKNOWN:
logprintf (LOG_NOTQUIET, _("%s: unknown/unsupported file type.\n"),
@@ -1969,7 +1978,7 @@ Already have correct symlink %s -> %s\n\n"),
/* We do not want to call ftp_retrieve_dirs here */
if (opt.recursive &&
!(opt.reclevel != INFINITE_RECURSION && depth >= opt.reclevel))
- err = ftp_retrieve_dirs (u, orig, con);
+ err = ftp_retrieve_dirs (u, original_url, orig, con);
else if (opt.recursive)
DEBUGP ((_("Will not retrieve dirs since depth is %d (max %d).\n"),
depth, opt.reclevel));
@@ -1982,7 +1991,8 @@ Already have correct symlink %s -> %s\n\n"),
ftp_retrieve_glob on each directory entry. The function knows
about excluded directories. */
static uerr_t
-ftp_retrieve_dirs (struct url *u, struct fileinfo *f, ccon *con)
+ftp_retrieve_dirs (struct url *u, struct url *original_url,
+ struct fileinfo *f, ccon *con)
{
char *container = NULL;
int container_size = 0;
@@ -2032,7 +2042,7 @@ Not descending to %s as it is excluded/not-included.\n"),
odir = xstrdup (u->dir); /* because url_set_dir will free
u->dir. */
url_set_dir (u, newdir);
- ftp_retrieve_glob (u, con, GLOB_GETALL);
+ ftp_retrieve_glob (u, original_url, con, GLOB_GETALL);
url_set_dir (u, odir);
xfree (odir);
@@ -2091,14 +2101,15 @@ is_invalid_entry (struct fileinfo *f)
GLOB_GLOBALL, use globbing; if it's GLOB_GETALL, download the whole
directory. */
static uerr_t
-ftp_retrieve_glob (struct url *u, ccon *con, int action)
+ftp_retrieve_glob (struct url *u, struct url *original_url,
+ ccon *con, int action)
{
struct fileinfo *f, *start;
uerr_t res;
con->cmd |= LEAVE_PENDING;
- res = ftp_get_listing (u, con, &start);
+ res = ftp_get_listing (u, original_url, con, &start);
if (res != RETROK)
return res;
/* First: weed out that do not conform the global rules given in
@@ -2194,7 +2205,7 @@ ftp_retrieve_glob (struct url *u, ccon *con, int action)
if (start)
{
/* Just get everything. */
- res = ftp_retrieve_list (u, start, con);
+ res = ftp_retrieve_list (u, original_url, start, con);
}
else
{
@@ -2210,7 +2221,7 @@ ftp_retrieve_glob (struct url *u, ccon *con, int action)
{
/* Let's try retrieving it anyway. */
con->st |= ON_YOUR_OWN;
- res = ftp_loop_internal (u, NULL, con, NULL);
+ res = ftp_loop_internal (u, original_url, NULL, con, NULL);
return res;
}
@@ -2230,8 +2241,8 @@ ftp_retrieve_glob (struct url *u, ccon *con, int action)
of URL. Inherently, its capabilities are limited on what can be
encoded into a URL. */
uerr_t
-ftp_loop (struct url *u, char **local_file, int *dt, struct url *proxy,
- bool recursive, bool glob)
+ftp_loop (struct url *u, struct url *original_url, char **local_file, int *dt,
+ struct url *proxy, bool recursive, bool glob)
{
ccon con; /* FTP connection */
uerr_t res;
@@ -2252,16 +2263,17 @@ ftp_loop (struct url *u, char **local_file, int *dt, struct url *proxy,
if (!*u->file && !recursive)
{
struct fileinfo *f;
- res = ftp_get_listing (u, &con, &f);
+ res = ftp_get_listing (u, original_url, &con, &f);
if (res == RETROK)
{
if (opt.htmlify && !opt.spider)
{
+ struct url *url_file = opt.trustservernames ? u : original_url;
char *filename = (opt.output_document
? xstrdup (opt.output_document)
: (con.target ? xstrdup (con.target)
- : url_file_name (u, NULL)));
+ : url_file_name (url_file, NULL)));
res = ftp_index (filename, u, f);
if (res == FTPOK && opt.verbose)
{
@@ -2306,11 +2318,13 @@ ftp_loop (struct url *u, char **local_file, int *dt, struct url *proxy,
/* ftp_retrieve_glob is a catch-all function that gets called
if we need globbing, time-stamping, recursion or preserve
permissions. Its third argument is just what we really need. */
- res = ftp_retrieve_glob (u, &con,
+ res = ftp_retrieve_glob (u, original_url, &con,
ispattern ? GLOB_GLOBALL : GLOB_GETONE);
}
else
- res = ftp_loop_internal (u, NULL, &con, local_file);
+ {
+ res = ftp_loop_internal (u, original_url, NULL, &con, local_file);
+ }
}
if (res == FTPOK)
res = RETROK;
diff --git a/src/ftp.h b/src/ftp.h
index be00d88..2abc9c0 100644
--- a/src/ftp.h
+++ b/src/ftp.h
@@ -129,7 +129,8 @@ enum wget_ftp_fstatus
};
struct fileinfo *ftp_parse_ls (const char *, const enum stype);
-uerr_t ftp_loop (struct url *, char **, int *, struct url *, bool, bool);
+uerr_t ftp_loop (struct url *, struct url *, char **, int *, struct url *,
+ bool, bool);
uerr_t ftp_index (const char *, struct url *, struct fileinfo *);
diff --git a/src/retr.c b/src/retr.c
index 66624dc..21fad56 100644
--- a/src/retr.c
+++ b/src/retr.c
@@ -794,7 +794,8 @@ retrieve_url (struct url * orig_parsed, const char *origurl, char **file,
if (redirection_count)
oldrec = glob = false;
- result = ftp_loop (u, &local_file, dt, proxy_url, recursive, glob);
+ result = ftp_loop (u, orig_parsed, &local_file, dt, proxy_url,
+ recursive, glob);
recursive = oldrec;
/* There is a possibility of having HTTP being redirected to
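
The heart of this patch is one decision: derive the local file name from the URL the user originally requested, not the one the server redirected to, unless --trust-server-names is set. A reduced sketch of that decision (struct url and the flag are simplified stand-ins for Wget's internals):

#include <stdio.h>

/* Simplified stand-ins for Wget's struct url and url_file_name(). */
struct url { const char *file; };

static const char *
choose_local_name (const struct url *redirected, const struct url *original,
                   int trust_server_names)
{
  /* Before the fix the redirected URL always named the file, so a
     malicious redirect could choose the destination path.  After the
     fix the originally requested URL wins unless the user opts in. */
  const struct url *pick =
    (trust_server_names || original == NULL) ? redirected : original;
  return pick->file;
}

int main (void)
{
  struct url asked = { "file.txt" };
  struct url redir = { ".bash_profile" };   /* attacker-chosen name */
  printf ("default:            %s\n", choose_local_name (&redir, &asked, 0));
  printf ("trust-server-names: %s\n", choose_local_name (&redir, &asked, 1));
  return 0;
}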

SOURCES/wget-1.14-CVE-2017-13089.patch (+18)

@@ -0,0 +1,18 @@
@@ -, +, @@
(CVE-2017-13089)
---
src/http.c | 3 +++
1 file changed, 3 insertions(+)
--- a/src/http.c
+++ a/src/http.c
@@ -973,6 +973,9 @@ skip_short_body (int fd, wgint contlen, bool chunked)
remaining_chunk_size = strtol (line, &endl, 16);
xfree (line);
+ if (remaining_chunk_size < 0)
+ return false;
+
if (remaining_chunk_size == 0)
{
line = fd_read_line (fd);
--

SOURCES/wget-1.14-CVE-2017-13090.patch (+21)

@@ -0,0 +1,21 @@
@@ -, +, @@
(CVE-2017-13090)
---
src/retr.c | 6 ++++++
1 file changed, 6 insertions(+)
--- a/src/retr.c
+++ a/src/retr.c
@@ -378,6 +378,12 @@ fd_read_body (const char *downloaded_filename, int fd, FILE *out, wgint toread,
remaining_chunk_size = strtol (line, &endl, 16);
xfree (line);
+ if (remaining_chunk_size < 0)
+ {
+ ret = -1;
+ break;
+ }
+
if (remaining_chunk_size == 0)
{
ret = 0;
--
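
This fix and the CVE-2017-13089 fix above guard the same parsing step: strtol() with base 16 happily accepts a leading minus sign, so a malicious server can send a negative chunk length that later flows into size arithmetic. A standalone sketch of the failure mode and the added guard (not Wget's actual parser):

#include <stdio.h>
#include <stdlib.h>

/* Parse one chunked-encoding size line, then apply the CVE fix:
   reject negative values before they reach buffer arithmetic. */
static long parse_chunk_size (const char *line)
{
  char *endl;
  long remaining_chunk_size = strtol (line, &endl, 16);
  if (remaining_chunk_size < 0)   /* the added check */
    return -1;                    /* treat as a protocol error */
  return remaining_chunk_size;
}

int main (void)
{
  printf ("%ld\n", parse_chunk_size ("1a3\r\n"));        /* 419 */
  printf ("%ld\n", parse_chunk_size ("-e0000000\r\n"));  /* -1: rejected */
  return 0;
}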

SOURCES/wget-1.14-CVE-2018-0494.patch (+43)

@@ -0,0 +1,43 @@
diff --git a/src/http.c b/src/http.c
index b45c404..aa4fd25 100644
--- a/src/http.c
+++ b/src/http.c
@@ -605,9 +605,9 @@ struct response {
resp_header_*. */
static struct response *
-resp_new (const char *head)
+resp_new (char *head)
{
- const char *hdr;
+ char *hdr;
int count, size;
struct response *resp = xnew0 (struct response);
@@ -636,15 +636,23 @@ resp_new (const char *head)
break;
/* Find the end of HDR, including continuations. */
- do
+ for (;;)
{
- const char *end = strchr (hdr, '\n');
+ char *end = strchr (hdr, '\n');
+
if (end)
hdr = end + 1;
else
hdr += strlen (hdr);
+
+ if (*hdr != ' ' && *hdr != '\t')
+ break;
+
+ // continuation, transform \r and \n into spaces
+ *end = ' ';
+ if (end > head && end[-1] == '\r')
+ end[-1] = ' ';
}
- while (*hdr == ' ' || *hdr == '\t');
}
DO_REALLOC (resp->headers, size, count + 1, const char *);
resp->headers[count] = NULL;
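
The rewritten loop folds header continuation lines (lines starting with space or tab) in place, which is why resp_new() now needs a writable buffer. A compact standalone model of that folding (a whole-buffer variant; Wget's loop runs per header):

#include <stdio.h>
#include <string.h>

/* Fold continuation lines in place: a line starting with SP/TAB
   continues the previous header, so its terminating \r\n becomes
   spaces, as in the CVE-2018-0494 fix. */
static void fold_continuations (char *head)
{
  char *hdr = head;
  for (;;)
    {
      char *end = strchr (hdr, '\n');
      if (!end)
        break;
      hdr = end + 1;
      if (*hdr != ' ' && *hdr != '\t')
        continue;                 /* next header, nothing to fold */
      *end = ' ';
      if (end > head && end[-1] == '\r')
        end[-1] = ' ';
    }
}

int main (void)
{
  char head[] = "Set-Cookie: a=1;\r\n path=/\r\nHost: example.org\r\n";
  fold_continuations (head);
  printf ("%s", head);            /* Set-Cookie line now folded */
  return 0;
}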

SOURCES/wget-1.14-Fix-deadcode-and-possible-NULL-use.patch (+47)

@@ -0,0 +1,47 @@
From 613d8639c48b950f76d132b70d27e518ba6d6891 Mon Sep 17 00:00:00 2001
From: Tomas Hozza <thozza@redhat.com>
Date: Fri, 26 Apr 2013 14:42:30 +0200
Subject: [PATCH] Fix dead code and possible use of NULL pointer

Fix dead code in unique_create() so that the "opened_name" parameter is
always initialized to a valid string or NULL when returning from the
function.

Fix redirect_output() so that "logfile" is checked for NULL instead of
being used blindly in the fprintf() call.

Signed-off-by: Tomas Hozza <thozza@redhat.com>
---
src/log.c | 2 +-
src/utils.c | 2 +-
2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/src/log.c b/src/log.c
index 0185df1..4f93a21 100644
--- a/src/log.c
+++ b/src/log.c
@@ -871,7 +871,7 @@ redirect_output (void)
can do but disable printing completely. */
fprintf (stderr, _("\n%s received.\n"), redirect_request_signal_name);
fprintf (stderr, _("%s: %s; disabling logging.\n"),
- logfile, strerror (errno));
+ (logfile) ? logfile : DEFAULT_LOGFILE, strerror (errno));
inhibit_logging = true;
}
save_context_p = false;
diff --git a/src/utils.c b/src/utils.c
index 567dc35..7cc942f 100644
--- a/src/utils.c
+++ b/src/utils.c
@@ -703,7 +703,7 @@ unique_create (const char *name, bool binary, char **opened_name)
xfree (uname);
uname = unique_name (name, false);
}
- if (opened_name && fp != NULL)
+ if (opened_name)
{
if (fp)
*opened_name = uname;
--
1.8.1.4

SOURCES/wget-1.14-add-openssl-tlsv11-tlsv12-support.patch (+122)

@@ -0,0 +1,122 @@
diff --git a/doc/wget.texi b/doc/wget.texi
index 118fce9..3bd8dd7 100644
--- a/doc/wget.texi
+++ b/doc/wget.texi
@@ -1555,16 +1555,17 @@ without SSL support, none of these options are available.
@cindex SSL protocol, choose
@item --secure-protocol=@var{protocol}
Choose the secure protocol to be used. Legal values are @samp{auto},
-@samp{SSLv2}, @samp{SSLv3}, and @samp{TLSv1}. If @samp{auto} is used,
-the SSL library is given the liberty of choosing the appropriate
-protocol automatically, which is achieved by sending an SSLv2 greeting
-and announcing support for SSLv3 and TLSv1. This is the default.
-
-Specifying @samp{SSLv2}, @samp{SSLv3}, or @samp{TLSv1} forces the use
-of the corresponding protocol. This is useful when talking to old and
-buggy SSL server implementations that make it hard for OpenSSL to
-choose the correct protocol version. Fortunately, such servers are
-quite rare.
+@samp{SSLv2}, @samp{SSLv3}, @samp{TLSv1}, @samp{TLSv1_1} and
+@samp{TLSv1_2}. If @samp{auto} is used, the SSL library is given the
+liberty of choosing the appropriate protocol automatically, which is
+achieved by sending a SSLv2 greeting and announcing support for SSLv3
+and TLSv1. This is the default.
+
+Specifying @samp{SSLv2}, @samp{SSLv3}, @samp{TLSv1}, @samp{TLSv1_1} or
+@samp{TLSv1_2} forces the use of the corresponding protocol. This is
+useful when talking to old and buggy SSL server implementations that
+make it hard for the underlying SSL library to choose the correct
+protocol version. Fortunately, such servers are quite rare.
@cindex SSL certificate, check
@item --no-check-certificate
diff --git a/src/init.c b/src/init.c
index 4cee677..f160bec 100644
--- a/src/init.c
+++ b/src/init.c
@@ -1488,6 +1488,8 @@ cmd_spec_secure_protocol (const char *com, const char *val, void *place)
{ "sslv2", secure_protocol_sslv2 },
{ "sslv3", secure_protocol_sslv3 },
{ "tlsv1", secure_protocol_tlsv1 },
+ { "tlsv1_1", secure_protocol_tlsv1_1 },
+ { "tlsv1_2", secure_protocol_tlsv1_2 },
};
int ok = decode_string (val, choices, countof (choices), place);
if (!ok)
diff --git a/src/main.c b/src/main.c
index 9cbad9f..3d50dad 100644
--- a/src/main.c
+++ b/src/main.c
@@ -625,7 +625,7 @@ HTTP options:\n"),
HTTPS (SSL/TLS) options:\n"),
N_("\
--secure-protocol=PR choose secure protocol, one of auto, SSLv2,\n\
- SSLv3, and TLSv1.\n"),
+ SSLv3, TLSv1, TLSv1_1 and TLSv1_2.\n"),
N_("\
--no-check-certificate don't validate the server's certificate.\n"),
N_("\
diff --git a/src/openssl.c b/src/openssl.c
index b3c31ce..141a8a3 100644
--- a/src/openssl.c
+++ b/src/openssl.c
@@ -40,6 +40,9 @@ as that of the covered work. */
#include <openssl/x509v3.h>
#include <openssl/err.h>
#include <openssl/rand.h>
+#if OPENSSL_VERSION_NUMBER >= 0x00907000
+#include <openssl/conf.h>
+#endif
#include "utils.h"
#include "connect.h"
@@ -176,6 +179,12 @@ ssl_init (void)
goto error;
}
+#if OPENSSL_VERSION_NUMBER >= 0x00907000
+ OPENSSL_load_builtin_modules();
+ ENGINE_load_builtin_engines();
+ CONF_modules_load_file(NULL, NULL,
+ CONF_MFLAGS_DEFAULT_SECTION|CONF_MFLAGS_IGNORE_MISSING_FILE);
+#endif
SSL_library_init ();
SSL_load_error_strings ();
SSLeay_add_all_algorithms ();
@@ -197,6 +206,21 @@ ssl_init (void)
case secure_protocol_tlsv1:
meth = TLSv1_client_method ();
break;
+#if OPENSSL_VERSION_NUMBER >= 0x10001000
+ case secure_protocol_tlsv1_1:
+ meth = TLSv1_1_client_method ();
+ break;
+ case secure_protocol_tlsv1_2:
+ meth = TLSv1_2_client_method ();
+ break;
+#else
+ case secure_protocol_tlsv1_1:
+ logprintf (LOG_NOTQUIET, _("Your OpenSSL version is too old to support TLSv1.1\n"));
+ goto error;
+ case secure_protocol_tlsv1_2:
+ logprintf (LOG_NOTQUIET, _("Your OpenSSL version is too old to support TLSv1.2\n"));
+ goto error;
+#endif
default:
abort ();
}
diff --git a/src/options.h b/src/options.h
index 326123a..575e647 100644
--- a/src/options.h
+++ b/src/options.h
@@ -200,7 +200,9 @@ struct options
secure_protocol_auto,
secure_protocol_sslv2,
secure_protocol_sslv3,
- secure_protocol_tlsv1
+ secure_protocol_tlsv1,
+ secure_protocol_tlsv1_1,
+ secure_protocol_tlsv1_2
} secure_protocol; /* type of secure protocol to use. */
bool check_cert; /* whether to validate the server's cert */
char *cert_file; /* external client certificate to use. */
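
The OPENSSL_VERSION_NUMBER guards matter because TLSv1_1_client_method() and TLSv1_2_client_method() only exist from OpenSSL 1.0.1 onward. A condensed sketch of the selection logic against the pre-1.1.0 OpenSSL API this patch targets (enum names shortened; error handling reduced to a NULL return):

#include <openssl/ssl.h>

enum secure_protocol { proto_auto, proto_tlsv1, proto_tlsv1_1, proto_tlsv1_2 };

static const SSL_METHOD *
pick_method (enum secure_protocol p)
{
  switch (p)
    {
    case proto_auto:    return SSLv23_client_method ();
    case proto_tlsv1:   return TLSv1_client_method ();
#if OPENSSL_VERSION_NUMBER >= 0x10001000L
    case proto_tlsv1_1: return TLSv1_1_client_method ();
    case proto_tlsv1_2: return TLSv1_2_client_method ();
#else
    case proto_tlsv1_1:            /* library too old for TLS 1.1/1.2 */
    case proto_tlsv1_2: return NULL;
#endif
    }
  return NULL;
}

int main (void)
{
  return pick_method (proto_tlsv1_2) ? 0 : 1;
}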

SOURCES/wget-1.14-add_missing_options_doc.patch (+27)

@@ -0,0 +1,27 @@
From 8dc52c6eaa1993d140a52bc0627e436efd9870d0 Mon Sep 17 00:00:00 2001
From: Giuseppe Scrivano <gscrivano@gnu.org>
Date: Sun, 28 Apr 2013 22:41:24 +0200
Subject: [PATCH] doc: add documentation for --accept-regex and --reject-regex

---
doc/wget.texi | 4 ++++
1 files changed, 4 insertions(+)

diff --git a/doc/wget.texi b/doc/wget.texi
index fed188a..039f700 100644
--- a/doc/wget.texi
+++ b/doc/wget.texi
@@ -2049,6 +2049,10 @@ any of the wildcard characters, @samp{*}, @samp{?}, @samp{[} or
@samp{]}, appear in an element of @var{acclist} or @var{rejlist},
it will be treated as a pattern, rather than a suffix.
+@item --accept-regex @var{urlregex}
+@itemx --reject-regex @var{urlregex}
+Specify a regular expression to accept or reject the complete URL.
+
@item -D @var{domain-list}
@itemx --domains=@var{domain-list}
Set domains to be followed. @var{domain-list} is a comma-separated list
--
1.8.1.4

SOURCES/wget-1.14-digest-auth-qop-segfault-fix.patch (+22)

@@ -0,0 +1,22 @@
diff --git a/src/http.c b/src/http.c
index 5ee1c93..b45c404 100644
--- a/src/http.c
+++ b/src/http.c
@@ -3728,7 +3728,7 @@ digest_authentication_encode (const char *au, const char *user,
md5_finish_ctx (&ctx, hash);
dump_hash (a2buf, hash);
- if (!strcmp(qop,"auth"))
+ if (qop && !strcmp(qop,"auth"))
{
/* RFC 2617 Digest Access Authentication */
/* generate random hex string */
@@ -3776,7 +3776,7 @@ digest_authentication_encode (const char *au, const char *user,
res = xmalloc (res_size);
- if (!strcmp(qop,"auth"))
+ if (qop && !strcmp (qop, "auth"))
{
snprintf (res, res_size, "Digest "\
"username=\"%s\", realm=\"%s\", nonce=\"%s\", uri=\"%s\", response=\"%s\""\

SOURCES/wget-1.14-doc-missing-opts-and-fix-preserve-permissions.patch (+61)

@@ -0,0 +1,61 @@
From c78caecbb4209ce2e36a587497cf1d6b350e513a Mon Sep 17 00:00:00 2001
From: Tomas Hozza <thozza@redhat.com>
Date: Thu, 11 Jul 2013 15:52:28 +0000
Subject: Document missing options and fix --preserve-permissions

Added documentation for --regex-type and --preserve-permissions
options.

Fixed --preserve-permissions to work properly also when downloading a
single file over FTP.

Signed-off-by: Tomas Hozza <thozza@redhat.com>
---
diff --git a/doc/wget.texi b/doc/wget.texi
index 710f0ac..5054382 100644
--- a/doc/wget.texi
+++ b/doc/wget.texi
@@ -1816,6 +1816,10 @@ in some rare firewall configurations, active FTP actually works when
passive FTP doesn't. If you suspect this to be the case, use this
option, or set @code{passive_ftp=off} in your init file.
+@cindex file permissions
+@item --preserve-permissions
+Preserve remote file permissions instead of permissions set by umask.
+
@cindex symbolic links, retrieving
@item --retr-symlinks
Usually, when retrieving @sc{ftp} directories recursively and a symbolic
@@ -2057,6 +2061,11 @@ it will be treated as a pattern, rather than a suffix.
@itemx --reject-regex @var{urlregex}
Specify a regular expression to accept or reject the complete URL.
+@item --regex-type @var{regextype}
+Specify the regular expression type. Possible types are @samp{posix} or
+@samp{pcre}. Note that to be able to use @samp{pcre} type, wget has to be
+compiled with libpcre support.
+
@item -D @var{domain-list}
@itemx --domains=@var{domain-list}
Set domains to be followed. @var{domain-list} is a comma-separated list
diff --git a/src/ftp.c b/src/ftp.c
index 9b3d81c..1fe2bac 100644
--- a/src/ftp.c
+++ b/src/ftp.c
@@ -2285,11 +2285,11 @@ ftp_loop (struct url *u, char **local_file, int *dt, struct url *proxy,
file_part = u->path;
ispattern = has_wildcards_p (file_part);
}
- if (ispattern || recursive || opt.timestamping)
+ if (ispattern || recursive || opt.timestamping || opt.preserve_perm)
{
/* ftp_retrieve_glob is a catch-all function that gets called
- if we need globbing, time-stamping or recursion. Its
- third argument is just what we really need. */
+ if we need globbing, time-stamping, recursion or preserve
+ permissions. Its third argument is just what we really need. */
res = ftp_retrieve_glob (u, &con,
ispattern ? GLOB_GLOBALL : GLOB_GETONE);
}
--
cgit v0.9.0.2

SOURCES/wget-1.14-document-backups.patch (+60)

@@ -0,0 +1,60 @@
From 44ba49b31f4ea515f8a6ef2642a34c0fd2024b90 Mon Sep 17 00:00:00 2001
From: Giuseppe Scrivano <gscrivano@gnu.org>
Date: Tue, 9 Jul 2013 00:50:30 +0200
Subject: [PATCH] doc: document --backups

---
doc/wget.texi | 15 ++++++++++++---
src/main.c | 3 +++
2 files changed, 15 insertions(+), 3 deletions(-)

diff --git a/doc/wget.texi b/doc/wget.texi
index 5054382..7a1670e 100644
--- a/doc/wget.texi
+++ b/doc/wget.texi
@@ -630,6 +630,13 @@ Note that when @samp{-nc} is specified, files with the suffixes
@samp{.html} or @samp{.htm} will be loaded from the local disk and
parsed as if they had been retrieved from the Web.
+@cindex backing up files
+@item --backups=@var{backups}
+Before (over)writing a file, back up an existing file by adding a
+@samp{.1} suffix (@samp{_1} on VMS) to the file name. Such backup
+files are rotated to @samp{.2}, @samp{.3}, and so on, up to
+@var{backups} (and lost beyond that).
+
@cindex continue retrieval
@cindex incomplete downloads
@cindex resume download
@@ -2882,9 +2889,11 @@ enables it).
Enable/disable saving pre-converted files with the suffix
@samp{.orig}---the same as @samp{-K} (which enables it).
-@c @item backups = @var{number}
-@c #### Document me!
-@c
+@item backups = @var{number}
+Use up to @var{number} backups for a file. Backups are rotated by
+adding an incremental counter that starts at @samp{1}. The default is
+@samp{0}.
+
@item base = @var{string}
Consider relative @sc{url}s in input files (specified via the
@samp{input} command or the @samp{--input-file}/@samp{-i} option,
diff --git a/src/main.c b/src/main.c
index c895c4e..8ce0eb3 100644
--- a/src/main.c
+++ b/src/main.c
@@ -714,6 +714,9 @@ Recursive download:\n"),
N_("\
-k, --convert-links make links in downloaded HTML or CSS point to\n\
local files.\n"),
+ N_("\
+ --backups=N before writing file X, rotate up to N backup files.\n"),
+
#ifdef __VMS
N_("\
-K, --backup-converted before converting file X, back up as X_orig.\n"),
--
1.8.3.1

SOURCES/wget-1.14-fix-backups-to-work-as-documented.patch (+80)

@@ -0,0 +1,80 @@
From c52bbad9e4bad1393a9d6ba37e600d388f5ab419 Mon Sep 17 00:00:00 2001
From: Giuseppe Scrivano <gscrivano@gnu.org>
Date: Wed, 10 Jul 2013 20:59:34 +0200
Subject: [PATCH] Make --backups work as documented

---
src/http.c | 6 ------
src/options.h | 2 +-
src/url.c | 3 ++-
src/url.h | 6 ++++++
4 files changed, 9 insertions(+), 8 deletions(-)

diff --git a/src/http.c b/src/http.c
index 9f274dc..b0c782b 100644
--- a/src/http.c
+++ b/src/http.c
@@ -1641,12 +1641,6 @@ read_response_body (struct http_stat *hs, int sock, FILE *fp, wgint contlen,
} while (0)
#endif /* def __VMS [else] */
-/* The flags that allow clobbering the file (opening with "wb").
- Defined here to avoid repetition later. #### This will require
- rework. */
-#define ALLOW_CLOBBER (opt.noclobber || opt.always_rest || opt.timestamping \
- || opt.dirstruct || opt.output_document)
-
/* Retrieve a document through HTTP protocol. It recognizes status
code, and correctly handles redirections. It closes the network
socket. If it receives an error from the functions below it, it
diff --git a/src/options.h b/src/options.h
index ed38617..0a10c9b 100644
--- a/src/options.h
+++ b/src/options.h
@@ -166,7 +166,7 @@ struct options
bool timestamping; /* Whether to use time-stamping. */
bool backup_converted; /* Do we save pre-converted files as *.orig? */
- bool backups; /* Are numeric backups made? */
+ int backups; /* Are numeric backups made? */
char *useragent; /* User-Agent string, which can be set
to something other than Wget. */
diff --git a/src/url.c b/src/url.c
index 5e2b9a3..bf9d697 100644
--- a/src/url.c
+++ b/src/url.c
@@ -1669,11 +1669,12 @@ url_file_name (const struct url *u, char *replaced_filename)
2) Retrieval with regetting.
3) Timestamping is used.
4) Hierarchy is built.
+ 5) Backups are specified.
The exception is the case when file does exist and is a
directory (see `mkalldirs' for explanation). */
- if ((opt.noclobber || opt.always_rest || opt.timestamping || opt.dirstruct)
+ if (ALLOW_CLOBBER
&& !(file_exists_p (fname) && !file_non_directory_p (fname)))
{
unique = fname;
diff --git a/src/url.h b/src/url.h
index b7f4366..cd3782b 100644
--- a/src/url.h
+++ b/src/url.h
@@ -47,6 +47,12 @@ as that of the covered work. */
#define DEFAULT_FTP_PORT 21
#define DEFAULT_HTTPS_PORT 443
+/* The flags that allow clobbering the file (opening with "wb").
+ Defined here to avoid repetition later. #### This will require
+ rework. */
+#define ALLOW_CLOBBER (opt.noclobber || opt.always_rest || opt.timestamping \
+ || opt.dirstruct || opt.output_document || opt.backups > 0)
+
/* Specifies how, or whether, user auth information should be included
* in URLs regenerated from URL parse structures. */
enum url_auth_mode {
--
1.8.3.1
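
With ALLOW_CLOBBER extended, --backups=N behaves as documented: rotate the existing numbered backups, then move the file to .1. A generic model of that rotation (not Wget's code; rename() errors are ignored for brevity):

#include <stdio.h>

/* Rotate numbered backups: file.(N-1) -> file.N, ..., file -> file.1.
   The oldest backup, file.N, is overwritten and thus lost. */
static void rotate_backups (const char *fname, int backups)
{
  char from[4096], to[4096];
  for (int i = backups; i > 1; i--)
    {
      snprintf (from, sizeof from, "%s.%d", fname, i - 1);
      snprintf (to, sizeof to, "%s.%d", fname, i);
      rename (from, to);
    }
  snprintf (to, sizeof to, "%s.1", fname);
  rename (fname, to);
}

int main (void)
{
  rotate_backups ("index.html", 3);   /* keeps index.html.1 .. .3 */
  return 0;
}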

SOURCES/wget-1.14-fix-double-free-of-iri-orig_url.patch (+29)

@@ -0,0 +1,29 @@
From bdf2764457bef7c33be289b889ddf6df91773296 Mon Sep 17 00:00:00 2001
From: Tomas Hozza <thozza@redhat.com>
Date: Wed, 10 Jul 2013 13:23:37 +0200
Subject: [PATCH] Set iri->orig_url to NULL after free.

Set iri->orig_url to NULL after free to prevent a double
free in retrieve_url() and iri_free() when using IRI
and downloading a site that redirects to itself.

Signed-off-by: Tomas Hozza <thozza@redhat.com>
---
src/retr.c | 1 +
1 file changed, 1 insertion(+)

diff --git a/src/retr.c b/src/retr.c
index 6204839..66624dc 100644
--- a/src/retr.c
+++ b/src/retr.c
@@ -838,6 +838,7 @@ retrieve_url (struct url * orig_parsed, const char *origurl, char **file,
iri->utf8_encode = opt.enable_iri;
set_content_encoding (iri, NULL);
xfree_null (iri->orig_url);
+ iri->orig_url = NULL;
/* Now, see if this new location makes sense. */
newloc_parsed = url_parse (mynewloc, &up_error_code, iri, true);
--
1.8.3.1
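
The fix is the standard free-and-clear idiom: iri->orig_url is freed in retrieve_url() but stays reachable from iri_free(), so nulling it makes the second free a no-op. In isolation (xfree_and_null is an illustrative macro; Wget uses xfree_null() plus the explicit assignment):

#include <stdlib.h>

/* free(NULL) is defined as a no-op, so clearing the pointer after
   freeing makes a later "defensive" free harmless. */
#define xfree_and_null(p) do { free (p); (p) = NULL; } while (0)

struct iri { char *orig_url; };

static void iri_free (struct iri *i)
{
  free (i->orig_url);             /* safe even if already cleared */
  free (i);
}

int main (void)
{
  struct iri *i = calloc (1, sizeof *i);
  i->orig_url = malloc (16);
  xfree_and_null (i->orig_url);   /* first owner releases and clears */
  iri_free (i);                   /* no double free */
  return 0;
}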

SOURCES/wget-1.14-fix-synchronization-in-Test-proxied-https-auth.patch (+164)

@@ -0,0 +1,164 @@
From 082e7194605e99f0e50f8909fcaf10adee747cc8 Mon Sep 17 00:00:00 2001
From: Tomas Hozza <thozza@redhat.com>
Date: Fri, 5 May 2017 13:46:11 +0200
Subject: [PATCH] Fix client/server synchronization in
Test-proxied-https-auth.px test

Combination of upstream commits without adding support for Valgrind:
3eff3ad69a46364475e1f4abdf9412cfa87e3d6c
2303793a626158627bdb2ac255e0f58697682b24

Signed-off-by: Tomas Hozza <thozza@redhat.com>
---
tests/Test-proxied-https-auth.px | 82 +++++++++++++++++++++++-----------------
1 file changed, 48 insertions(+), 34 deletions(-)

diff --git a/tests/Test-proxied-https-auth.px b/tests/Test-proxied-https-auth.px
index 1de5357..e1a6c44 100755
--- a/tests/Test-proxied-https-auth.px
+++ b/tests/Test-proxied-https-auth.px
@@ -1,4 +1,6 @@
#!/usr/bin/env perl
+# Simulate a tunneling proxy to a HTTPS URL that needs authentication.
+# Use two connections (Connection: close)
use strict;
use warnings;
@@ -39,31 +41,33 @@ sub get_request {
}
sub do_server {
- my $alrm = alarm 10;
-
+ my ($synch_callback) = @_;
my $s = $SOCKET;
my $conn;
my $rqst;
my $rspn;
+
+ my %options = (
+ SSL_server => 1,
+ SSL_passwd_cb => sub { return "Hello"; });
+ $options{SSL_cert_file} = $cert_path if ($cert_path);
+ $options{SSL_key_file} = $key_path if ($key_path);
+ my @options = %options;
+
+ # sync with the parent
+ $synch_callback->();
+
+ # Simulate a HTTPS proxy server with tunneling.
+
for my $expect_inner_auth (0, 1) {
$conn = $s->accept;
$rqst = $conn->get_request;
-
- # TODO: expect no auth the first time, request it, expect it the second
- # time.
-
die "Method not CONNECT\n" if ($rqst->method ne 'CONNECT');
$rspn = HTTP::Response->new(200, 'OK');
$conn->send_response($rspn);
- my %options = (
- SSL_server => 1,
- SSL_passwd_cb => sub { return "Hello"; });
-
- $options{SSL_cert_file} = $cert_path if ($cert_path);
- $options{SSL_key_file} = $key_path if ($key_path);
-
- my @options = %options;
+ # Now switch from plain to SSL (for simulating a transparent tunnel
+ # to an HTTPS server).
$conn = IO::Socket::SSL->new_from_fd($conn->fileno, @options)
or die "Couldn't initiate SSL";
@@ -74,14 +78,10 @@ sub do_server {
unless ($expect_inner_auth) {
die "Early proxied auth\n" if $rqst->header('Authorization');
- # TODO: handle non-persistent connection here.
$rspn = HTTP::Response->new(401, 'Unauthorized', [
'WWW-Authenticate' => 'Basic realm="gondor"',
Connection => 'close'
]);
- $rspn->protocol('HTTP/1.0');
- print $rspn->as_string;
- print $conn $rspn->as_string;
} else {
die "No proxied auth\n" unless $rqst->header('Authorization');
@@ -89,41 +89,55 @@ sub do_server {
'Content-Type' => 'text/plain',
'Connection' => 'close',
], "foobarbaz\n");
- $rspn->protocol('HTTP/1.0');
- print "=====\n";
- print $rspn->as_string;
- print "\n=====\n";
- print $conn $rspn->as_string;
}
+
+ $rspn->protocol('HTTP/1.0');
+ print STDERR "=====\n";
+ print STDERR $rspn->as_string;
+ print STDERR "\n=====\n";
+ print $conn $rspn->as_string;
+
$conn->close;
}
+
undef $conn;
undef $s;
- alarm $alrm;
}
sub fork_server {
- my $pid = fork;
- die "Couldn't fork" if ($pid < 0);
- return $pid if $pid;
+ pipe(FROM_CHILD, TO_PARENT) or die "Cannot create pipe!";
+ select((select(TO_PARENT), $| = 1)[0]);
+
+ my $pid = fork();
+ if ($pid < 0) {
+ die "Cannot fork";
+ } elsif ($pid == 0) {
+ # child
+ close FROM_CHILD;
+ do_server(sub { print TO_PARENT "SYNC\n"; close TO_PARENT });
+ exit 0;
+ } else {
+ # parent
+ close TO_PARENT;
+ chomp(my $line = <FROM_CHILD>);
+ close FROM_CHILD;
+ }
- &do_server;
- exit;
+ return $pid;
}
-system ('rm -f needs-auth.txt');
+unlink "needs-auth.txt";
my $pid = &fork_server;
-sleep 1;
my $cmdline = $WgetTest::WGETPATH . " --user=fiddle-dee-dee"
. " --password=Dodgson -e https_proxy=localhost:{{port}}"
. " --no-check-certificate"
. " https://no.such.domain/needs-auth.txt";
$cmdline =~ s/{{port}}/$SOCKET->sockport()/e;
-my $code = system($cmdline);
-system ('rm -f needs-auth.txt');
+my $code = system($cmdline . " 2>&1") >> 8;
+unlink "needs-auth.txt";
warn "Got code: $code\n" if $code;
kill ('TERM', $pid);
-exit ($code >> 8);
+exit ($code != 0);
--
2.7.4
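
The substance of the test fix is replacing the parent's `sleep 1` with an explicit readiness signal from the forked server over a pipe. The same handshake expressed in C, as a generic model of the pattern the Perl test now uses:

#include <stdio.h>
#include <sys/wait.h>
#include <unistd.h>

/* Parent/child startup handshake: the child writes one line when it
   is ready to accept connections; the parent blocks on the read
   instead of guessing with sleep(). */
int main (void)
{
  int fds[2];
  if (pipe (fds) != 0)
    return 1;

  pid_t pid = fork ();
  if (pid == 0)                       /* child: the "server" */
    {
      close (fds[0]);
      /* ... bind/listen would happen here ... */
      write (fds[1], "SYNC\n", 5);    /* tell the parent we are ready */
      close (fds[1]);
      _exit (0);
    }

  close (fds[1]);
  char buf[8] = { 0 };
  read (fds[0], buf, sizeof buf - 1); /* blocks until the child is up */
  close (fds[0]);
  printf ("child ready: %s", buf);
  /* ... run the client against the server here ... */
  waitpid (pid, NULL, 0);
  return 0;
}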

SOURCES/wget-1.14-manpage-tex5.patch (+55)

@@ -0,0 +1,55 @@
From a2a34ad8e09117041761fa96830f289aa6e67042 Mon Sep 17 00:00:00 2001
From: Tomas Hozza <thozza@redhat.com>
Date: Fri, 22 Feb 2013 12:29:37 +0100
Subject: [PATCH] Fix @itemx issue when building doc

@itemx should be used ONLY for second and subsequent item(s).

Signed-off-by: Tomas Hozza <thozza@redhat.com>
---
doc/wget.texi | 8 ++++----
1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/doc/wget.texi b/doc/wget.texi
index c1fc82f..3768156 100644
--- a/doc/wget.texi
+++ b/doc/wget.texi
@@ -876,7 +876,7 @@ recommendation to block many unrelated users from a web site due to the
actions of one.
@cindex proxy
-@itemx --no-proxy
+@item --no-proxy
Don't use proxies, even if the appropriate @code{*_proxy} environment
variable is defined.
@@ -977,7 +977,7 @@ are outside the range of @sc{ascii} characters (that is, greater than
whose encoding does not match the one used locally.
@cindex IPv6
-@itemx -4
+@item -4
@itemx --inet4-only
@itemx -6
@itemx --inet6-only
@@ -3094,7 +3094,7 @@ display properly---the same as @samp{-p}.
Change setting of passive @sc{ftp}, equivalent to the
@samp{--passive-ftp} option.
-@itemx password = @var{string}
+@item password = @var{string}
Specify password @var{string} for both @sc{ftp} and @sc{http} file retrieval.
This command can be overridden using the @samp{ftp_password} and
@samp{http_password} command for @sc{ftp} and @sc{http} respectively.
@@ -3605,7 +3605,7 @@ In addition to the environment variables, proxy location and settings
may be specified from within Wget itself.
@table @samp
-@itemx --no-proxy
+@item --no-proxy
@itemx proxy = on/off
This option and the corresponding command may be used to suppress the
use of proxy, even if the appropriate environment variables are set.
--
1.8.1.2

SOURCES/wget-1.14-rh1147572.patch (+26)

@@ -0,0 +1,26 @@
From 798f554773baf1adca376500ca120a992e6d7492 Mon Sep 17 00:00:00 2001
From: Tim Ruehsen <tim.ruehsen@gmx.de>
Date: Tue, 28 Aug 2012 16:38:21 +0200
Subject: [PATCH] remove -nv from --report-speed in doc/wget.texi

---
doc/wget.texi | 3 +--
2 files changed, 5 insertions(+), 2 deletions(-)

diff --git a/doc/wget.texi b/doc/wget.texi
index 7efdc72..400debe 100644
--- a/doc/wget.texi
+++ b/doc/wget.texi
@@ -479,8 +479,7 @@ Turn off verbose without being completely quiet (use @samp{-q} for
that), which means that error messages and basic information still get
printed.
-@item -nv
-@itemx --report-speed=@var{type}
+@item --report-speed=@var{type}
Output bandwidth as @var{type}. The only accepted value is @samp{bits}.
@cindex input-file
--
1.9.3

SOURCES/wget-1.14-rh1203384.patch (+30)

@@ -0,0 +1,30 @@
From aed7d4163a9e2083d294a9471e1347ab13d6f2ab Mon Sep 17 00:00:00 2001
From: Pavel Mateja <pavel@netsafe.cz>
Date: Sat, 2 Nov 2013 11:27:58 +0100
Subject: [PATCH] http: specify Host when CONNECT is used.

---
src/http.c | 7 +++----
2 files changed, 7 insertions(+), 4 deletions(-)

diff --git a/src/http.c b/src/http.c
index dbfcdfb..8917fa5 100644
--- a/src/http.c
+++ b/src/http.c
@@ -2013,10 +2013,9 @@ gethttp (struct url *u, struct http_stat *hs, int *dt, struct url *proxy,
the regular request below. */
proxyauth = NULL;
}
- /* Examples in rfc2817 use the Host header in CONNECT
- requests. I don't see how that gains anything, given
- that the contents of Host would be exactly the same as
- the contents of CONNECT. */
+ request_set_header (connreq, "Host",
+ aprintf ("%s:%d", u->host, u->port),
+ rel_value);
write_error = request_send (connreq, sock, 0);
request_free (connreq);
--
2.1.0

SOURCES/wget-1.14-set_sock_to_-1_if_no_persistent_conn.patch (+32)

@@ -0,0 +1,32 @@
From 8760123cee87e07a276b8b13ef48ada3a490ad47 Mon Sep 17 00:00:00 2001
From: Tomas Hozza <thozza@redhat.com>
Date: Thu, 11 Jul 2013 11:22:43 +0000
Subject: Set sock variable to -1 if no persistent conn exists

Wget should set the sock variable to -1 if no persistent
connection exists. The function persistent_available_p()
tests the persistent connection, but if test_socket_open()
fails it closes the socket without setting the sock variable
to -1. After returning from persistent_available_p() it is
therefore possible that sock still holds the value of an
already closed connection.

Signed-off-by: Tomas Hozza <thozza@redhat.com>
---
diff --git a/src/http.c b/src/http.c
index 669f0fe..a693355 100644
--- a/src/http.c
+++ b/src/http.c
@@ -1983,6 +1983,10 @@ gethttp (struct url *u, struct http_stat *hs, int *dt, struct url *proxy,
exec_name, quote (relevant->host));
return HOSTERR;
}
+ else if (sock != -1)
+ {
+ sock = -1;
+ }
}
if (sock < 0)
--
cgit v0.9.0.2
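
The hazard being closed is a stale descriptor value: the helper closes the socket, but the caller's local variable keeps the old number, which a later open may reuse for an unrelated connection. A reduced model (persistent_available_p() here is a tiny stand-in for Wget's much larger helper):

#include <stdio.h>
#include <unistd.h>

/* Model of the bug: the helper closes the descriptor but the caller's
   copy keeps the stale number, so "sock >= 0" lies later on. */
static int persistent_available_p (int sock)
{
  close (sock);        /* test_socket_open() failed: connection dead */
  return 0;            /* "no persistent connection available" */
}

int main (void)
{
  int sock = dup (1);  /* pretend this is a cached connection */
  if (!persistent_available_p (sock))
    sock = -1;         /* the fix: forget the closed descriptor */
  if (sock < 0)
    puts ("no usable connection; a fresh one will be opened");
  return 0;
}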

SOURCES/wget-1.14-sslreadtimeout.patch (+105)

@@ -0,0 +1,105 @@
diff -up wget-1.14/src/openssl.c.ssltimeout wget-1.14/src/openssl.c
--- wget-1.14/src/openssl.c.ssltimeout 2012-08-09 14:30:14.987964706 +0200
+++ wget-1.14/src/openssl.c 2012-08-09 14:44:05.467660741 +0200
@@ -256,19 +256,42 @@ struct openssl_transport_context {
char *last_error; /* last error printed with openssl_errstr */
};
-static int
-openssl_read (int fd, char *buf, int bufsize, void *arg)
-{
- int ret;
- struct openssl_transport_context *ctx = arg;
+struct openssl_read_args {
+ int fd;
+ struct openssl_transport_context *ctx;
+ char *buf;
+ int bufsize;
+ int retval;
+};
+
+static void openssl_read_callback(void *arg) {
+ struct openssl_read_args *args = (struct openssl_read_args *) arg;
+ struct openssl_transport_context *ctx = args->ctx;
SSL *conn = ctx->conn;
+ char *buf = args->buf;
+ int bufsize = args->bufsize;
+ int ret;
+
do
ret = SSL_read (conn, buf, bufsize);
- while (ret == -1
- && SSL_get_error (conn, ret) == SSL_ERROR_SYSCALL
+ while (ret == -1 && SSL_get_error (conn, ret) == SSL_ERROR_SYSCALL
&& errno == EINTR);
+ args->retval = ret;
+}
- return ret;
+static int
+openssl_read (int fd, char *buf, int bufsize, void *arg)
+{
+ struct openssl_read_args args;
+ args.fd = fd;
+ args.buf = buf;
+ args.bufsize = bufsize;
+ args.ctx = (struct openssl_transport_context*) arg;
+
+ if (run_with_timeout(opt.read_timeout, openssl_read_callback, &args)) {
+ return -1;
+ }
+ return args.retval;
}
static int
@@ -386,6 +409,18 @@ static struct transport_implementation o
openssl_peek, openssl_errstr, openssl_close
};
+struct scwt_context {
+ SSL *ssl;
+ int result;
+};
+
+static void
+ssl_connect_with_timeout_callback(void *arg)
+{
+ struct scwt_context *ctx = (struct scwt_context *)arg;
+ ctx->result = SSL_connect(ctx->ssl);
+}
+
/* Perform the SSL handshake on file descriptor FD, which is assumed
to be connected to an SSL server. The SSL handle provided by
OpenSSL is registered with the file descriptor FD using
@@ -398,6 +433,7 @@ bool
ssl_connect_wget (int fd, const char *hostname)
{
SSL *conn;
+ struct scwt_context scwt_ctx;
struct openssl_transport_context *ctx;
DEBUGP (("Initiating SSL handshake.\n"));
@@ -425,7 +461,14 @@ ssl_connect_wget (int fd, const char *ho
if (!SSL_set_fd (conn, FD_TO_SOCKET (fd)))
goto error;
SSL_set_connect_state (conn);
- if (SSL_connect (conn) <= 0 || conn->state != SSL_ST_OK)
+
+ scwt_ctx.ssl = conn;
+ if (run_with_timeout(opt.read_timeout, ssl_connect_with_timeout_callback,
+ &scwt_ctx)) {
+ DEBUGP (("SSL handshake timed out.\n"));
+ goto timeout;
+ }
+ if (scwt_ctx.result <= 0 || conn->state != SSL_ST_OK)
goto error;
ctx = xnew0 (struct openssl_transport_context);
@@ -441,6 +484,7 @@ ssl_connect_wget (int fd, const char *ho
error:
DEBUGP (("SSL handshake failed.\n"));
print_errors ();
+ timeout:
if (conn)
SSL_free (conn);
return false;
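
run_with_timeout() is Wget's existing helper for bounding a blocking call, and this patch routes SSL_read() and the SSL_connect() handshake through it. A self-contained approximation of that pattern using alarm() and siglongjmp() (Wget's real helper is platform-dependent; this is only the classic sketch):

#include <setjmp.h>
#include <signal.h>
#include <stdio.h>
#include <unistd.h>

static sigjmp_buf timed_out;

static void on_alarm (int sig) { (void) sig; siglongjmp (timed_out, 1); }

/* Run fun(arg), but give up after `seconds`.
   Returns 1 on timeout, 0 on completion. */
static int run_with_timeout (int seconds, void (*fun) (void *), void *arg)
{
  if (sigsetjmp (timed_out, 1))
    return 1;                     /* landed here via SIGALRM */
  signal (SIGALRM, on_alarm);
  alarm (seconds);
  fun (arg);
  alarm (0);                      /* cancel the pending alarm */
  return 0;
}

static void slow_read (void *arg) { (void) arg; sleep (5); /* stand-in */ }

int main (void)
{
  if (run_with_timeout (2, slow_read, NULL))
    puts ("read timed out");      /* printed after ~2 seconds */
  return 0;
}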

SOURCES/wget-1.14-support-non-ASCII-characters.patch (+154)

@@ -0,0 +1,154 @@
From 0a33fa22c597234ab133f63127b4a5e00cf048b9 Mon Sep 17 00:00:00 2001
From: Tomas Hozza <thozza@redhat.com>
Date: Mon, 20 Jun 2016 12:10:38 +0200
Subject: [PATCH] Support non-ASCII characters

Upstream commit 59b920874daa565a1323ffa1e756e80493190686

Signed-off-by: Tomas Hozza <thozza@redhat.com>
---
src/url.c | 87 +++++++++++++++++++++++++++++++++++++++++++++++++--
tests/Test-ftp-iri.px | 4 +--
2 files changed, 87 insertions(+), 4 deletions(-)

diff --git a/src/url.c b/src/url.c
index 6bca719..d0d9e27 100644
--- a/src/url.c
+++ b/src/url.c
@@ -42,6 +42,11 @@ as that of the covered work. */
#include "url.h"
#include "host.h" /* for is_valid_ipv6_address */
+#if HAVE_ICONV
+#include <iconv.h>
+#include <langinfo.h>
+#endif
+
#ifdef __VMS
#include "vms.h"
#endif /* def __VMS */
@@ -1335,8 +1340,8 @@ UWC, C, C, C, C, C, C, C, /* NUL SOH STX ETX EOT ENQ ACK BEL */
0, 0, 0, 0, 0, 0, 0, 0, /* p q r s t u v w */
0, 0, 0, 0, W, 0, 0, C, /* x y z { | } ~ DEL */
- C, C, C, C, C, C, C, C, C, C, C, C, C, C, C, C, /* 128-143 */
- C, C, C, C, C, C, C, C, C, C, C, C, C, C, C, C, /* 144-159 */
+ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, /* 128-143 */
+ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, /* 144-159 */
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
@@ -1456,6 +1461,82 @@ append_uri_pathel (const char *b, const char *e, bool escaped,
TAIL_INCR (dest, outlen);
}
+static char *
+convert_fname (const char *fname)
+{
+ char *converted_fname = (char *)fname;
+#if HAVE_ICONV
+ const char *from_encoding = opt.encoding_remote;
+ const char *to_encoding = opt.locale;
+ iconv_t cd;
+ size_t len, done, inlen, outlen;
+ char *s;
+ const char *orig_fname = fname;
+
+ /* Defaults for remote and local encodings. */
+ if (!from_encoding)
+ from_encoding = "UTF-8";
+ if (!to_encoding)
+ to_encoding = nl_langinfo (CODESET);
+
+ cd = iconv_open (to_encoding, from_encoding);
+ if (cd == (iconv_t)(-1))
+ logprintf (LOG_VERBOSE, _("Conversion from %s to %s isn't supported\n"),
+ quote (from_encoding), quote (to_encoding));
+ else
+ {
+ inlen = strlen (fname);
+ len = outlen = inlen * 2;
+ converted_fname = s = xmalloc (outlen + 1);
+ done = 0;
+
+ for (;;)
+ {
+ if (iconv (cd, &fname, &inlen, &s, &outlen) != (size_t)(-1)
+ && iconv (cd, NULL, NULL, &s, &outlen) != (size_t)(-1))
+ {
+ *(converted_fname + len - outlen - done) = '\0';
+ iconv_close(cd);
+ DEBUGP (("Converted file name '%s' (%s) -> '%s' (%s)\n",
+ orig_fname, from_encoding, converted_fname, to_encoding));
+ xfree (orig_fname);
+ return converted_fname;
+ }
+
+ /* Incomplete or invalid multibyte sequence */
+ if (errno == EINVAL || errno == EILSEQ)
+ {
+ logprintf (LOG_VERBOSE,
+ _("Incomplete or invalid multibyte sequence encountered\n"));
+ xfree (converted_fname);
+ converted_fname = (char *)orig_fname;
+ break;
+ }
+ else if (errno == E2BIG) /* Output buffer full */
+ {
+ done = len;
+ len = outlen = done + inlen * 2;
+ converted_fname = xrealloc (converted_fname, outlen + 1);
+ s = converted_fname + done;
+ }
+ else /* Weird, we got an unspecified error */
+ {
+ logprintf (LOG_VERBOSE, _("Unhandled errno %d\n"), errno);
+ xfree (converted_fname);
+ converted_fname = (char *)orig_fname;
+ break;
+ }
+ }
+ DEBUGP (("Failed to convert file name '%s' (%s) -> '?' (%s)\n",
+ orig_fname, from_encoding, to_encoding));
+ }
+
+ iconv_close(cd);
+#endif
+
+ return converted_fname;
+}
+
/* Append to DEST the directory structure that corresponds the
directory part of URL's path. For example, if the URL is
http://server/dir1/dir2/file, this appends "/dir1/dir2".
@@ -1582,6 +1663,8 @@ url_file_name (const struct url *u, char *replaced_filename)
fname = fnres.base;
+ fname = convert_fname (fname);
+
/* Check the cases in which the unique extensions are not used:
1) Clobbering is turned off (-nc).
2) Retrieval with regetting.
diff --git a/tests/Test-ftp-iri.px b/tests/Test-ftp-iri.px
index a4b7fe1..24ac467 100755
--- a/tests/Test-ftp-iri.px
+++ b/tests/Test-ftp-iri.px
@@ -26,12 +26,12 @@ my %urls = (
},
);
-my $cmdline = $WgetTest::WGETPATH . " --local-encoding=iso-8859-1 -S ftp://localhost:{{port}}/fran${ccedilla_l1}ais.txt";
+my $cmdline = $WgetTest::WGETPATH . " --local-encoding=iso-8859-1 --remote-encoding=utf-8 -S ftp://localhost:{{port}}/fran${ccedilla_l1}ais.txt";
my $expected_error_code = 0;
my %expected_downloaded_files = (
- "fran${ccedilla_u8}ais.txt" => {
+ "fran${ccedilla_l1}ais.txt" => {
content => $francais,
},
);
--
2.5.5
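
Aside from the wget-specific logging and option plumbing, the heart of convert_fname() above is a conventional iconv(3) grow-and-retry loop. A self-contained sketch under those assumptions (simplified error handling; no flushing of conversion state for stateful encodings, which the patch does handle via its second iconv() call):

    /* Sketch: convert ("fran\xc3\xa7ais.txt", "UTF-8", "ISO-8859-1") */
    #include <errno.h>
    #include <iconv.h>
    #include <stdlib.h>
    #include <string.h>

    static char *
    convert (const char *in, const char *from, const char *to)
    {
      iconv_t cd = iconv_open (to, from);
      if (cd == (iconv_t) -1)
        return NULL;                   /* conversion pair not supported */

      size_t inlen = strlen (in);
      size_t outlen = inlen * 2;       /* first guess; grown on E2BIG */
      char *out = malloc (outlen + 1);
      char *s = out;

      while (iconv (cd, (char **) &in, &inlen, &s, &outlen) == (size_t) -1)
        {
          if (errno == E2BIG)          /* output buffer full: enlarge */
            {
              size_t done = s - out;
              outlen = inlen * 2;
              out = realloc (out, done + outlen + 1);
              s = out + done;
            }
          else                         /* EILSEQ or EINVAL: give up */
            {
              free (out);
              iconv_close (cd);
              return NULL;
            }
        }

      *s = '\0';
      iconv_close (cd);
      return out;
    }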

25
SOURCES/wget-1.14-texi2pod_error_perl518.patch

@@ -0,0 +1,25 @@
From 7f43748544f26008d0dd337704f02a6ed3200aaf Mon Sep 17 00:00:00 2001
From: Dave Reisner <dreisner@archlinux.org>
Date: Mon, 17 Jun 2013 23:31:46 +0530
Subject: [PATCH] Fix error in texi2pod introduced with Perl 5.18

---
doc/texi2pod.pl | 2 +-
1 files changed, 1 insertions(+), 1 deletion(-)

diff --git a/doc/texi2pod.pl b/doc/texi2pod.pl
index 86c4b18..9db6de1 100755
--- a/doc/texi2pod.pl
+++ b/doc/texi2pod.pl
@@ -291,7 +291,7 @@ while(<$inf>) {
if (defined $1) {
my $thing = $1;
if ($ic =~ /\@asis/) {
- $_ = "\n=item $thing\n";
+ $_ = "\n=item C<$thing>\n";
} else {
# Entity escapes prevent munging by the <> processing below.
$_ = "\n=item $ic\&LT;$thing\&GT;\n";
--
1.8.1.4
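
The effect of the change, shown on an illustrative option name (not taken from wget's manual): for an @asis table item texi2pod previously emitted

    =item --no-clobber

and now emits

    =item C<--no-clobber>

wrapping the item text in POD's C<> (code) markup; per the commit subject, this is what avoids the failure seen with Perl 5.18.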

11
SOURCES/wget-rh-modified.patch

@@ -0,0 +1,11 @@
--- configure~ 2011-09-13 03:15:38.000000000 -0500
+++ configure 2011-12-16 09:19:34.574773958 -0600
@@ -561,7 +561,7 @@
PACKAGE_NAME='wget'
PACKAGE_TARNAME='wget'
PACKAGE_VERSION='1.14'
-PACKAGE_STRING='wget 1.14'
+PACKAGE_STRING='wget 1.14 (Red Hat modified)'
PACKAGE_BUGREPORT='bug-wget@gnu.org'
PACKAGE_URL=''

590
SPECS/wget.spec

@@ -0,0 +1,590 @@
Summary: A utility for retrieving files using the HTTP or FTP protocols
Name: wget
Version: 1.14
Release: 18%{?dist}
License: GPLv3+
Group: Applications/Internet
Url: http://www.gnu.org/software/wget/
Source: ftp://ftp.gnu.org/gnu/wget/wget-%{version}.tar.xz

Patch1: wget-rh-modified.patch
Patch2: wget-1.12-path.patch
Patch3: wget-1.14-sslreadtimeout.patch
Patch4: wget-1.14-manpage-tex5.patch
Patch5: wget-1.14-add_missing_options_doc.patch
Patch6: wget-1.14-texi2pod_error_perl518.patch
Patch7: wget-1.14-fix-double-free-of-iri-orig_url.patch
Patch8: wget-1.14-Fix-deadcode-and-possible-NULL-use.patch
Patch9: wget-1.14-doc-missing-opts-and-fix-preserve-permissions.patch
Patch10: wget-1.14-set_sock_to_-1_if_no_persistent_conn.patch
Patch11: wget-1.14-document-backups.patch
Patch12: wget-1.14-fix-backups-to-work-as-documented.patch
Patch13: wget-1.14-CVE-2014-4877.patch
Patch14: wget-1.14-rh1203384.patch
Patch15: wget-1.14-rh1147572.patch
Patch16: wget-1.14-CVE-2016-4971.patch
# needed because fix for CVE-2016-4971 changes default behavior
# and the file is not saved in correct encoding. This caused the
# Test-ftp-iri-fallback test to fail. This additional change makes
# Test-ftp-iri-fallback test pass again.
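# For reference, the Test-ftp-iri scenario these encoding fixes exercise
# boils down to an invocation like this ({{port}} is substituted by the
# test harness; shown only for illustration):
#   wget --local-encoding=iso-8859-1 --remote-encoding=utf-8 \
#        -S "ftp://localhost:{{port}}/français.txt"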
Patch17: wget-1.14-support-non-ASCII-characters.patch
Patch18: wget-1.14-add-openssl-tlsv11-tlsv12-support.patch
# Fix for randomly failing unit test
# combination of upstream commits without the support for Valgrind
# commit 3eff3ad69a46364475e1f4abdf9412cfa87e3d6c
# commit 2303793a626158627bdb2ac255e0f58697682b24
Patch19: wget-1.14-fix-synchronization-in-Test-proxied-https-auth.patch
Patch20: wget-1.14-CVE-2017-13089.patch
Patch21: wget-1.14-CVE-2017-13090.patch
# Partial backport without setting the default algorithm
# http://git.savannah.gnu.org/cgit/wget.git/commit/?id=e9cc8b2f7c4678b832ad56f7119bba86a8db08ef
Patch22: wget-1.14-digest-auth-qop-segfault-fix.patch
# https://git.savannah.gnu.org/cgit/wget.git/commit/?id=1fc9c95ec144499e69dc8ec76dbe07799d7d82cd
Patch23: wget-1.14-CVE-2018-0494.patch

Provides: webclient
Provides: bundled(gnulib)
Requires(post): /sbin/install-info
Requires(preun): /sbin/install-info
BuildRequires: openssl-devel, pkgconfig, texinfo, gettext, autoconf, libidn-devel, libuuid-devel, perl-podlators
# dependencies for the test suite
BuildRequires: perl-libwww-perl
BuildRoot: %{_tmppath}/%{name}-%{version}-%{release}-root-%(%{__id_u} -n)

%description
GNU Wget is a file retrieval utility which can use either the HTTP or
FTP protocols. Wget features include the ability to work in the
background while you are logged out, recursive retrieval of
directories, file name wildcard matching, remote file timestamp
storage and comparison, use of Rest with FTP servers and Range with
HTTP servers to retrieve files over slow or unstable connections,
support for Proxy servers, and configurability.

%prep
%setup -q
%patch1 -p0
%patch2 -p1
%patch3 -p1 -b .sslreadtimeout
%patch4 -p1
%patch5 -p1
%patch6 -p1
%patch7 -p1
%patch8 -p1
%patch9 -p1
%patch10 -p1
%patch11 -p1
%patch12 -p1
%patch13 -p1
%patch14 -p1
%patch15 -p1
%patch16 -p1
%patch17 -p1
%patch18 -p1 -b .tls11_tls12
%patch19 -p1 -b .test_synch_fix
%patch20 -p1 -b .CVE-2017-13089
%patch21 -p1 -b .CVE-2017-13090
%patch22 -p1 -b .digest-auth-segfault
%patch23 -p1 -b .CVE-2018-0494
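# Each %patchN line above expands to roughly the plain patch(1) command
# below; -p1 strips one leading path component and -b/-z keep a backup
# of every touched file under the given suffix (Patch3 shown as an
# illustrative equivalent):
#   patch -p1 -b -z .sslreadtimeout < wget-1.14-sslreadtimeout.patch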

%build
if pkg-config openssl ; then
CPPFLAGS=`pkg-config --cflags openssl`; export CPPFLAGS
LDFLAGS=`pkg-config --libs openssl`; export LDFLAGS
fi
%configure --with-ssl=openssl --enable-largefile --enable-opie --enable-digest --enable-ntlm --enable-nls --enable-ipv6 --disable-rpath
make %{?_smp_mflags}

%install
rm -rf $RPM_BUILD_ROOT
make install DESTDIR=$RPM_BUILD_ROOT CFLAGS="$RPM_OPT_FLAGS"
rm -f $RPM_BUILD_ROOT/%{_infodir}/dir

%find_lang %{name}

%post
/sbin/install-info %{_infodir}/wget.info.gz %{_infodir}/dir || :

%preun
if [ "$1" = 0 ]; then
/sbin/install-info --delete %{_infodir}/wget.info.gz %{_infodir}/dir || :
fi

%clean
rm -rf $RPM_BUILD_ROOT

%check
make check

%files -f %{name}.lang
%defattr(-,root,root)
%doc AUTHORS MAILING-LIST NEWS README COPYING doc/sample.wgetrc
%config(noreplace) %{_sysconfdir}/wgetrc
%{_mandir}/man1/wget.*
%{_bindir}/wget
%{_infodir}/*

%changelog
* Wed May 09 2018 Tomas Hozza <thozza@redhat.com> - 1.14-18
- Fix CVE-2018-0494 (#1576106)

* Mon Apr 23 2018 Tomas Hozza <thozza@redhat.com> - 1.14-17
- Fix segfault when Digest Authentication header is missing 'qop' part (#1545310)

* Tue Oct 24 2017 Tomas Hozza <thozza@redhat.com> - 1.14-16
- Fixed various security flaws (CVE-2017-13089, CVE-2017-13090)

* Fri May 05 2017 Tomas Hozza <thozza@redhat.com> - 1.14-15
- Added TLSv1_1 and TLSv1_2 as secure-protocol values to help (#1439811)
- Fixed synchronization in randomly failing unit test Test-proxied-https-auth (#1448440)

* Wed Apr 12 2017 Tomas Hozza <thozza@redhat.com> - 1.14-14
- TLS v1.1 and v1.2 can now be specified with --secure-protocol option (#1439811)

* Mon Jun 20 2016 Tomas Hozza <thozza@redhat.com> - 1.14-13
- Fix CVE-2016-4971 (#1345778)
- Added support for non-ASCII URLs (Related: CVE-2016-4971)

* Mon Mar 21 2016 Tomas Hozza <thozza@redhat.com> - 1.14-12
- Fix wget to include Host header on CONNECT as required by HTTP 1.1 (#1203384)
- Run internal test suite during build (#1295846)
- Fix -nv being documented as synonym for two options (#1147572)

* Fri Oct 24 2014 Tomas Hozza <thozza@redhat.com> - 1.14-11
- Fix CVE-2014-4877 wget: FTP symlink arbitrary filesystem access (#1156136)

* Fri Jan 24 2014 Daniel Mach <dmach@redhat.com> - 1.14-10
- Mass rebuild 2014-01-24

* Fri Dec 27 2013 Daniel Mach <dmach@redhat.com> - 1.14-9
- Mass rebuild 2013-12-27

* Mon Jul 15 2013 Tomas Hozza <thozza@redhat.com> - 1.14-8
- Fix deadcode and possible use of NULL in vprintf (#913153)
- Add documentation for --regex-type and --preserve-permissions
- Fix --preserve-permissions to work as documented (and expected)
- Fix bug when authenticating using user:password@url syntax (#912358)
- Document and fix --backups option

* Wed Jul 10 2013 Tomas Hozza <thozza@redhat.com> - 1.14-7
- Fix double free of iri->orig_url (#981778)

* Mon Jun 24 2013 Tomas Hozza <thozza@redhat.com> - 1.14-6
- add missing options accept-regex and reject-regex to man page
- fix errors in texi2pod introduced in Perl-5.18

* Fri Feb 22 2013 Tomas Hozza <thozza@redhat.com> - 1.14-5
- Added BuildRequires: perl-podlators for pod2man
- Patched manpage to silence new TeX errors
- Resolves: (#914571)

* Fri Feb 15 2013 Fedora Release Engineering <rel-eng@lists.fedoraproject.org> - 1.14-4
- Rebuilt for https://fedoraproject.org/wiki/Fedora_19_Mass_Rebuild

* Thu Oct 11 2012 Tomas Hozza <thozza@redhat.com> 1.14-3
- Added libuuid-devel to BuildRequires to use libuuid functions
  in "src/warc.c" (#865421)

* Wed Oct 10 2012 Tomas Hozza <thozza@redhat.com> 1.14-2
- Added libidn-devel to BuildRequires to support IDN domains (#680394)

* Thu Aug 09 2012 Karsten Hopp <karsten@redhat.com> 1.14-1
- Update to wget-1.14

* Sun Jul 22 2012 Fedora Release Engineering <rel-eng@lists.fedoraproject.org> - 1.13.4-5
- Rebuilt for https://fedoraproject.org/wiki/Fedora_18_Mass_Rebuild

* Tue May 29 2012 Karsten Hopp <karsten@redhat.com> 1.13.4-4
- fix timeout if http server doesn't answer to SSL handshake (#860727)

* Tue May 15 2012 Karsten Hopp <karsten@redhat.com> 1.13.4-3
- add virtual provides per https://fedoraproject.org/wiki/Packaging:No_Bundled_Libraries

* Sat Jan 14 2012 Fedora Release Engineering <rel-eng@lists.fedoraproject.org> - 1.13.4-2
- Rebuilt for https://fedoraproject.org/wiki/Fedora_17_Mass_Rebuild

* Fri Dec 16 2011 Jon Ciesla <limburgher@gmail.com> - 1.13.4-1
- New upstream, BZ 730286.
- Modified path patch.
- subjectAltNames patch upstreamed.
- Specified openssl at config time.

* Thu Jun 23 2011 Volker Fröhlich <volker27@gmx.at> - 1.12-4
- Applied patch to accept subjectAltNames in X509 certificates (#674186)
- New URL (#658969)

* Mon Feb 07 2011 Fedora Release Engineering <rel-eng@lists.fedoraproject.org> - 1.12-3
- Rebuilt for https://fedoraproject.org/wiki/Fedora_15_Mass_Rebuild

* Wed Nov 18 2009 Karsten Hopp <karsten@redhat.com> 1.12-2
- don't provide /usr/share/info/dir

* Tue Nov 17 2009 Karsten Hopp <karsten@redhat.com> 1.12-1
- update to wget-1.12
- fixes CVE-2009-3490 wget: incorrect verification of SSL certificate
with NUL in name

* Fri Aug 21 2009 Tomas Mraz <tmraz@redhat.com> - 1.11.4-5
- rebuilt with new openssl

* Mon Jul 27 2009 Fedora Release Engineering <rel-eng@lists.fedoraproject.org> - 1.11.4-4
- Rebuilt for https://fedoraproject.org/wiki/Fedora_12_Mass_Rebuild

* Wed Feb 25 2009 Fedora Release Engineering <rel-eng@lists.fedoraproject.org> - 1.11.4-3
- Rebuilt for https://fedoraproject.org/wiki/Fedora_11_Mass_Rebuild

* Sun Jan 18 2009 Tomas Mraz <tmraz@redhat.com> 1.11.4-2
- rebuild with new openssl

* Wed Aug 13 2008 Karsten Hopp <karsten@redhat.com> 1.11.4-1
- update

* Wed Jun 04 2008 Karsten Hopp <karsten@redhat.com> 1.11.3-1
- wget-1.11.3, downgrades the combination of the -N and -O options
to a warning instead of an error

* Fri May 09 2008 Karsten Hopp <karsten@redhat.com> 1.11.2-1
- wget-1.11.2, fixes #179962

* Mon Mar 31 2008 Karsten Hopp <karsten@redhat.com> 1.11.1-1
- update to bugfix release 1.11.1, fixes e.g. #433606

* Tue Feb 19 2008 Fedora Release Engineering <rel-eng@fedoraproject.org> - 1.11-2
- Autorebuild for GCC 4.3

* Tue Dec 04 2007 Karsten Hopp <karsten@redhat.com> 1.10.2-17
- rebuild to pick up new openssl SONAME

* Mon Aug 27 2007 Karsten Hopp <karsten@redhat.com> 1.10.2-16
- fix license tag
- rebuild

* Mon Feb 12 2007 Karsten Hopp <karsten@redhat.com> 1.10.2-15
- fix discarding of expired cookies
- escape non-printable characters
- drop to11 patch for now (#223754, #227853, #227498)

* Mon Feb 05 2007 Karsten Hopp <karsten@redhat.com> 1.10.2-14
- shut up rpmlint, even though xx isn't a macro

* Mon Feb 05 2007 Karsten Hopp <karsten@redhat.com> 1.10.2-13
- merge review changes (#226538)
- use version/release/... in buildroot tag
- remove BR perl
- use SMP flags
- use make install instead of %%makeinstall
- include copy of license
- use Requires(post)/Requires(preun)
- use optflags
- remove trailing dot from summary
- change tabs to spaces

* Thu Jan 18 2007 Karsten Hopp <karsten@redhat.com> 1.10.2-12
- don't abort (un)install scriptlets when _excludedocs is set (Ville Skyttä)

* Wed Jan 10 2007 Karsten Hopp <karsten@redhat.com> 1.10.2-11
- add fix for CVE-2006-6719

* Fri Dec 08 2006 Karsten Hopp <karsten@redhat.com> 1.10.2-10
- fix repeated downloads (Tomas Heinrich, #186195)

* Thu Dec 07 2006 Karsten Hopp <karsten@redhat.com> 1.10.2-9
- add distflag, rebuild

* Thu Dec 07 2006 Karsten Hopp <karsten@redhat.com> 1.10.2-8
- Resolves: #218211
fix double free corruption

* Sun Oct 01 2006 Jesse Keating <jkeating@redhat.com> - 1.10.2-7
- rebuilt for unwind info generation, broken in gcc-4.1.1-21

* Mon Sep 25 2006 Karsten Hopp <karsten@redhat.de> 1.10.2-6
- fix resumed downloads (#205723)

* Wed Jul 12 2006 Jesse Keating <jkeating@redhat.com> - 1.10.2-5.1
- rebuild

* Thu Jun 29 2006 Karsten Hopp <karsten@redhat.de> 1.10.2-5
- updated german translations from Robert Scheck

* Tue Jun 27 2006 Karsten Hopp <karsten@redhat.de> 1.10.2-4
- upstream patches

* Fri Feb 10 2006 Jesse Keating <jkeating@redhat.com> - 1.10.2-3.2.1
- bump again for double-long bug on ppc(64)

* Tue Feb 07 2006 Jesse Keating <jkeating@redhat.com> - 1.10.2-3.2
- rebuilt for new gcc4.1 snapshot and glibc changes

* Fri Dec 09 2005 Jesse Keating <jkeating@redhat.com>
- rebuilt

* Thu Nov 10 2005 Tomas Mraz <tmraz@redhat.com> 1.10.2-3
- rebuilt against new openssl

* Tue Oct 25 2005 Karsten Hopp <karsten@redhat.de> 1.10.2-2
- use %%{_sysconfdir} (#171555)

* Sat Oct 15 2005 Florian La Roche <laroche@redhat.com>
- 1.10.2

* Thu Sep 08 2005 Karsten Hopp <karsten@redhat.de> 1.10.1-7
- fix builtin help of --load-cookies / --save-cookies (#165408)

* Wed Sep 07 2005 Karsten Hopp <karsten@redhat.de> 1.10.1-6
- convert changelog to UTF-8 (#159585)

* Mon Sep 05 2005 Karsten Hopp <karsten@redhat.de> 1.10.1-5
- update
- drop patches which are already in the upstream sources

* Wed Jul 13 2005 Karsten Hopp <karsten@redhat.de> 1.10-5
- update german translation

* Mon Jul 11 2005 Karsten Hopp <karsten@redhat.de> 1.10-4
- update german translation (Robert Scheck)

* Tue Jul 05 2005 Karsten Hopp <karsten@redhat.de> 1.10-3
- fix minor documentation bug
- fix --no-cookies crash

* Mon Jul 04 2005 Karsten Hopp <karsten@redhat.de> 1.10-2
- update to wget-1.10
- drop passive-ftp patch, already in 1.10
- drop CVS patch
- drop LFS patch, similar fix in 1.10
- drop protdir patch, similar fix in 1.10
- drop actime patch, already in 1.10

* Wed Mar 02 2005 Karsten Hopp <karsten@redhat.de> 1.9.1-22
- build with gcc-4

* Wed Feb 02 2005 Karsten Hopp <karsten@redhat.de> 1.9.1-21
- remove old copy of the manpage (#146875, #135597)
- fix garbage in manpage (#117519)

* Tue Feb 01 2005 Karsten Hopp <karsten@redhat.de> 1.9.1-20
- texi2pod doesn't handle texinfo xref's. rewrite some lines so that
the man page doesn't have incomplete sentences anymore (#140470)

* Mon Jan 31 2005 Karsten Hopp <karsten@redhat.de> 1.9.1-19
- Don't set actime to access time of the remote file or tmpwatch might
remove the file again (#146440). Set it to the current time instead.
timestamping checks only modtime, so this should be ok.

* Thu Jan 20 2005 Karsten Hopp <karsten@redhat.de> 1.9.1-18
- add support for --protocol-directories option as documented
in the man page (Ville Skyttä, #145571)

* Wed Sep 29 2004 Karsten Hopp <karsten@redhat.de> 1.9.1-17
- additional LFS patch from Leonid Petrov to fix file lengths in
http downloads

* Thu Sep 16 2004 Karsten Hopp <karsten@redhat.de> 1.9.1-16
- more fixes

* Tue Sep 14 2004 Karsten Hopp <karsten@redhat.de> 1.9.1-15
- added strtol fix from Leonid Petrov, reenable LFS

* Tue Sep 14 2004 Karsten Hopp <karsten@redhat.de> 1.9.1-14
- buildrequires gettext (#132519)

* Wed Sep 01 2004 Karsten Hopp <karsten@redhat.de> 1.9.1-13
- disable LFS patch for now, it breaks normal downloads (123524#c15)

* Tue Aug 31 2004 Karsten Hopp <karsten@redhat.de> 1.9.1-12
- move largefile stuff inside the configure script, it didn't
get appended to CFLAGS

* Tue Aug 31 2004 Karsten Hopp <karsten@redhat.de> 1.9.1-11
- rebuild

* Tue Aug 31 2004 Karsten Hopp <karsten@redhat.de> 1.9.1-10
- fix patch

* Sun Aug 29 2004 Karsten Hopp <karsten@redhat.de> 1.9.1-9
- more cleanups of the manpage (#117519)

* Fri Aug 27 2004 Karsten Hopp <karsten@redhat.de> 1.9.1-8
- rebuild

* Fri Aug 27 2004 Karsten Hopp <karsten@redhat.de> 1.9.1-7
- clean up manpage (#117519)
- buildrequire texinfo (#123780)
- LFS patch, based on wget-LFS-20040630.patch from Leonid Petrov
(#123524, #124628, #115348)

* Tue Jun 15 2004 Elliot Lee <sopwith@redhat.com>
- rebuilt

* Thu Mar 11 2004 Karsten Hopp <karsten@redhat.de> 1.9.1-3
- fix documentation (#117517)

* Fri Feb 13 2004 Elliot Lee <sopwith@redhat.com>
- rebuilt

* Fri Nov 28 2003 Karsten Hopp <karsten@redhat.de> 1.9.1-3
- update to -stable CVS
- document the passive ftp default

* Fri Nov 28 2003 Karsten Hopp <karsten@redhat.de> 1.9.1-2
- add patch from -stable CVS

* Fri Nov 28 2003 Karsten Hopp <karsten@redhat.de> 1.9.1-1
- update to 1.9.1
- remove obsolete patches

* Mon Aug 04 2003 Karsten Hopp <karsten@redhat.de> 1.8.2-15.3
- fix variable usage

* Tue Jul 22 2003 Nalin Dahyabhai <nalin@redhat.com> 1.8.2-15.2
- rebuild

* Wed Jun 25 2003 Karsten Hopp <karsten@redhat.de> 1.8.2-15.1
- rebuilt

* Wed Jun 25 2003 Karsten Hopp <karsten@redhat.de> 1.8.2-15
- default to passive-ftp (#97996)

* Wed Jun 04 2003 Elliot Lee <sopwith@redhat.com>
- rebuilt

* Wed Jun 04 2003 Karsten Hopp <karsten@redhat.de> 1.8.2-13
- rebuild

* Wed Jun 04 2003 Karsten Hopp <karsten@redhat.de> 1.8.2-12
- merge debian patch for long URLs
- cleanup filename patch

* Sun May 11 2003 Karsten Hopp <karsten@redhat.de> 1.8.2-11
- rebuild

* Sun May 11 2003 Karsten Hopp <karsten@redhat.de> 1.8.2-10
- upstream fix off-by-one error

* Wed Jan 22 2003 Tim Powers <timp@redhat.com>
- rebuilt

* Tue Jan 7 2003 Nalin Dahyabhai <nalin@redhat.com> 1.8.2-8
- rebuild

* Fri Dec 13 2002 Nalin Dahyabhai <nalin@redhat.com>
- use openssl pkg-config data, if present
- don't bomb out when building with newer openssl

* Thu Dec 12 2002 Tim Powers <timp@redhat.com> 1.8.2-7
- rebuild on all arches

* Tue Nov 19 2002 Tim Powers <timp@redhat.com>
- rebuild on all arches

* Fri Oct 4 2002 Karsten Hopp <karsten@redhat.de> 1.8.2-5
- fix directory traversal bug

* Wed Jul 24 2002 Trond Eivind Glomsrød <teg@redhat.com> 1.8.2-3
- Don't segfault when downloading URLs A-B-A (A-A-B worked) #49859

* Fri Jun 21 2002 Tim Powers <timp@redhat.com>
- automated rebuild

* Wed May 29 2002 Florian La Roche <Florian.LaRoche@redhat.de>
- update to 1.8.2 (bug-fix release)

* Thu May 23 2002 Tim Powers <timp@redhat.com>
- automated rebuild

* Mon Apr 29 2002 Florian La Roche <Florian.LaRoche@redhat.de>
- remove s390 patch, not needed anymore

* Wed Feb 27 2002 Trond Eivind Glomsrød <teg@redhat.com> 1.8.1-4
- Rebuild

* Wed Jan 09 2002 Tim Powers <timp@redhat.com>
- automated rebuild

* Fri Dec 28 2001 Florian La Roche <Florian.LaRoche@redhat.de>
- add hack to not link against libmd5, even if available

* Fri Dec 28 2001 Florian La Roche <Florian.LaRoche@redhat.de>
- update to 1.8.1

* Thu Dec 13 2001 Florian La Roche <Florian.LaRoche@redhat.de>
- update to 1.8
- also include md5global to get it compile

* Sun Nov 18 2001 Florian La Roche <Florian.LaRoche@redhat.de>
- update to 1.7.1

* Wed Sep 5 2001 Phil Knirsch <phil@redhat.de> 1.7-3
- Added va_args patch required for S390.

* Mon Sep 3 2001 Trond Eivind Glomsrød <teg@redhat.com> 1.7-2
- Configure with ssl support (duh - #53116)
- s/Copyright/License/

* Wed Jun 6 2001 Trond Eivind Glomsrød <teg@redhat.com>
- 1.7
- Require perl for building (to get man pages)
- Don't include the Japanese po file, it's now included
- Use %%{_tmppath}
- no patches necessary
- Make /etc/wgetrc noreplace
- More docs

* Tue Jan 30 2001 Trond Eivind Glomsrød <teg@redhat.com>
- Norwegian isn't an iso-8859-2 locale, neither is Danish.
This fixes #15025.
- langify

* Sat Jan 6 2001 Bill Nottingham <notting@redhat.com>
- escape %%xx characters before fnmatch (#23475, patch from alane@geeksrus.net)

* Fri Jan 5 2001 Bill Nottingham <notting@redhat.com>
- update to 1.6, fix patches accordingly (#23412)
- fix symlink patch (#23411)

* Mon Dec 18 2000 Yukihiro Nakai <ynakai@redhat.com>
- Add Japanese and Korean Resources

* Tue Aug 1 2000 Bill Nottingham <notting@redhat.com>
- setlocale for LC_CTYPE too, or else all the translations think their
characters are unprintable.

* Thu Jul 13 2000 Prospector <bugzilla@redhat.com>
- automatic rebuild

* Sun Jun 11 2000 Bill Nottingham <notting@redhat.com>
- build in new environment

* Mon Jun 5 2000 Bernhard Rosenkraenzer <bero@redhat.com>
- FHS compliance

* Thu Feb 3 2000 Bill Nottingham <notting@redhat.com>
- handle compressed man pages

* Thu Aug 26 1999 Jeff Johnson <jbj@redhat.com>
- don't permit chmod 777 on symlinks (#4725).

* Sun Mar 21 1999 Cristian Gafton <gafton@redhat.com>
- auto rebuild in the new build environment (release 4)

* Fri Dec 18 1998 Bill Nottingham <notting@redhat.com>
- build for 6.0 tree
- add Provides

* Sat Oct 10 1998 Cristian Gafton <gafton@redhat.com>
- strip binaries
- version 1.5.3

* Sat Jun 27 1998 Jeff Johnson <jbj@redhat.com>
- updated to 1.5.2

* Thu Apr 30 1998 Cristian Gafton <gafton@redhat.com>
- modified group to Applications/Networking

* Wed Apr 22 1998 Cristian Gafton <gafton@redhat.com>
- upgraded to 1.5.0
- they removed the man page from the distribution (Duh!) and I added it back
from 1.4.5. Hey, removing the man page is DUMB!

* Fri Nov 14 1997 Cristian Gafton <gafton@redhat.com>
- first build against glibc