Mailing List Archive

(no subject)
I used to be a Network Administrator for a company with 300 seats, plus as a software developer I had admin rights to all the company's MS SQL Server databases. Unclogging the network and resolving database gridlock were 2 areas I excelled in there.

If you'd like to give me developer access, I could take a look around and try to see what keeps slowing us down. I guess it's a lot of different things, many of which Brion has already identified.

But the fact that restarting the machine always speeds things up again indicates the probable presence of one or more as-yet unidentified problems.

Of course, as a "developer" I would be ever scrupulous about the "rules" -- I would absolutely not use developer rights to, say, win a POV battle or unilaterally ban an obnoxious user.

Ed Poor
"Opinions and proposals expressed in this letter are mine personally, and are unrelated to any aims or policies of my employer."
Re: (no subject) [ In reply to ]
Poor, Edmund W wrote:

>I used to be a Network Administrator for a company with 300 seats, plus as a software developer I had admin rights to all the company's MS SQL Server databases. Unclogging the network and resolving database gridlock were 2 areas I excelled in there.
>
>If you'd like to give me developer access, I could take a look around and try to see what keeps slowing us down. I guess it's a lot of different things, many of which Brion has already identified.
>
Done. For the site, that is. You'll need Jimbo to give you ssh access to
the server.

Magnus
Re: (no subject) [ In reply to ]
This sounds interesting. I know nothing about the technical side either
and would not consider competing for podium space; I don't even know
whether the conference itself would interest me.

If enough people are interested this could serve as the excuse for a
get-together of Wikipedians, particularly the ones on the West coast.
For me it's a leisurely 6-hour drive from Vancouver. There was talk
about such things last year that never materialized.

Ec

Geoff Burling wrote:

>This year, O'Reilly will be presenting the Open Source Convention here
>in Portland, Oregon. Since travelling to it is a negligible task
>(it's in downtown, & I usually commute further to work than there), I've
>been considering making a presentation about Wikipedia at this year's
>convention.
>
>However, I can think of about a dozen people who know far more about
>the goings-on at Wikipedia than me (e.g., Brion, Magnus, Mav, Ed Poor --
>practically everyone on this mailing list), & I wanted to make sure that
>
>1. No one is planning on making a similar presentation;
>
>2. No one minds that I attempt to give what O'Reilly calls a "Session
>Presentation" (it's a 90 minute talk) on Wikipedia.
>
>Obviously, I'm not very familiar with the technical side of Wikipedia --
>although the challenge of keeping one of the largest Wikiwiki sites up
>& running is something that any talk needs to address. My intended emphasis
>for this talk would be more on the social side -- as well as addressing
>the perennial question, "Can a distributed group of people write a
>usable encyclopedia?" (FWIW, my honest answer to that question is yes.)
>
>More information about making presentations at OSCON is available at
>http://conferences.oreillynet.com/os2004/
>
Re: (no subject) [ In reply to ]
karl@karllong.com wrote:
> Warning: $wgProxyKey is insecure Can't find a writable temp
> directory for the XHTML template. Check that the TMP environment
> variable points to a writable directory, or that the default temp
> dir (/tmp) exists and is writable.

So have you done this?
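
For reference, a quick way to check both conditions the warning mentions (a
minimal diagnostic sketch in Python; nothing MediaWiki-specific is assumed, it
only looks at the TMP variable and the default temp directory):

import os
import tempfile

# Check the TMP environment variable mentioned in the warning, if it is set.
tmp_env = os.environ.get("TMP")
if tmp_env:
    print("TMP =", tmp_env, "writable:", os.access(tmp_env, os.W_OK))
else:
    print("TMP is not set")

# Check that the default temp dir (/tmp on most systems) exists and is writable.
default_tmp = tempfile.gettempdir()
print("default temp dir:", default_tmp, "writable:", os.access(default_tmp, os.W_OK))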

-- brion vibber (brion @ pobox.com)
Re: (no subject) [ In reply to ]
I've got no idea what I need to unblock for this to work :-( Try again
until you get a different IP address?

I've cc'ed this to the technical list for their consideration. This
user is on AOL and is blocked from creating an account, but we don't
have enough information for me to work out which autoblock is causing
the problem and undo it.


- d.



On 20/02/06, WARDCOLIN6@aol.com <WARDCOLIN6@aol.com> wrote:

> When I tried to create an account in order to log on, I was told that I had already opened ten accounts and was blocked from creating any more.
_______________________________________________
Wikitech-l mailing list
Wikitech-l@wikimedia.org
http://mail.wikipedia.org/mailman/listinfo/wikitech-l
Re: (no subject) [ In reply to ]
David Gerard wrote:
> I've got no idea what I need to unblock for this to work :-( Try again
> until you get a different IP address?
>
> I've cc'ed this to the technical list for their consideration. This
> user is on AOL and is blocked from creating an account, but we don't
> have enough information for me to work out which autoblock is causing
> the problem and undo it.

That's the rate limiter, not the autoblocker.

Wait a day and try again.

-- brion vibber (brion @ pobox.com)
Re: (no subject) [ In reply to ]
Sorry for the double mail. Could we imagine a search engine that can look into:
1 - The mailing lists where users can ask questions
2 - The main forums
3 - The main documentation sites (mediawiki.org and sites such as
semantic-mediawiki)
4 - ...

Google can do 2 and 3; I don't know about 1.

There are also layers that would take more work than a simple personalised Google.

2009/7/3 Thibaut DEVERAUX <thibaut.deveraux@gmail.com>

> What about a search engine which can find information in those mailing lists?
> This is not a "solution", but it is a way to make users able to find
> information.
>
>
>
> Original messages:
>
> Ever met a developer who likes writing doc? :)
>
> > and a lot of the docs have never been read by a developer. That being
> > said, using FlaggedRevs we might be able to deliver more solid docs
> > on MW.org by flagging docs at like two levels. One could be like a basic
> > "has been looked over for glaring errors and basic readability" and
> > a second could be "has been thoroughly reviewed and is considered
> > the doc on the given subject."
>
> Perhaps we could start by getting developers to thoroughly review
> documentation?
>
> You're proposing a technical solution to a people problem. The problem
> is not that the site can't display the fact that a developer vouches
> for the quality of documentation. The problem is that there are no
> processes for getting developers to review documentation and vouch for
> it.
>
_______________________________________________
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: (no subject) [ In reply to ]
I'm not sure what you're proposing here. I think the
primary problem isn't "where are the docs?" but rather
"how good are the docs?"

A great example came up in #mediawiki a few days
ago. A user asked about using pretty URLs relative to
their document root (i.e., example.com/Page). A good
number of us gave the typical line about it being
broken and told them not to use it. Upon further discussion, it
became clearer that this idea was not broken due to
huge issues in MediaWiki itself; rather, that's what
the docs said, so we had just treated it as fact.

That's obviously not a position we wish to take on our
documentation. It hinders proper support of the
software and wastes everyone's time.

-Chad

On Jul 3, 2009 9:11 AM, "Thibaut DEVERAUX" <thibaut.deveraux@gmail.com>
wrote:

Sorry for the double mail. Could we imagine a search engine that can look into:
1 - The mailing lists where users can ask questions
2 - The main forums
3 - The main documentation sites (mediawiki.org and sites such as
semantic-mediawiki)
4 - ...

Google can do 2 and 3; I don't know about 1.

There are also layers that would take more work than a simple personalised Google.

2009/7/3 Thibaut DEVERAUX <thibaut.deveraux@gmail.com>

> What about a search engine which can find information in those mailing lists?
> This is not a "solut...
_______________________________________________
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: (no subject) [ In reply to ]
Chad wrote:
> I'm not sure what you're proposing here. I think the
> primary problem isn't "where are the docs?" but rather
> "how good are the docs?"
>
> A great example came up in #mediawiki a few days
> ago. A user asked about using pretty URLs relative to
> their document root (i.e., example.com/Page). A good
> number of us gave the typical line about it being
> broken and told them not to use it. Upon further discussion, it
> became clearer that this idea was not broken due to
> huge issues in MediaWiki itself; rather, that's what
> the docs said, so we had just treated it as fact.

The idea is broken not due to technical problems, but due to conceptual
problems, namely namespace pollution. Mixing two different things (files and
wiki pages) in the same namespace is a bad thing. It can be done, but it often
leads to problems. This is why the documentation recommends against it. But the
documentation also tells you how to do it.

-- daniel

_______________________________________________
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: (no subject) [ In reply to ]
>> A while back I ran clamav against all 'executable' looking external links and found one nasty file. It would be really nice if the mechanism that updates externalinks table spat out a running log of external link additions and removals that we could hook an ongoing scanner into.
>
>
> Simple matter of coding, then? :-)

This sort of thing would help with some of our external antispam tools.
Currently we rely on parsing edits manually to see when links are added
- some realtime and machine-readable format for notifications of such
edits would be great.
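
Until such a feed exists, here is a rough interim sketch of the idea, polling
recent changes and pulling each changed page's external links through the
standard API modules list=recentchanges and prop=extlinks. This assumes Python
with the requests library; the wiki URL, user agent and pause are placeholders,
and a real scanner would also diff against previously seen links:

import time
import requests

API = "https://en.wikipedia.org/w/api.php"  # placeholder wiki
HEADERS = {"User-Agent": "extlink-monitor-sketch/0.1 (contact: example@example.org)"}

def recent_titles(limit=50):
    """Titles of recently changed pages, via list=recentchanges."""
    params = {"action": "query", "list": "recentchanges", "rcprop": "title",
              "rclimit": limit, "format": "json"}
    data = requests.get(API, params=params, headers=HEADERS).json()
    return [rc["title"] for rc in data["query"]["recentchanges"]]

def external_links(title):
    """Current external links on one page, via prop=extlinks."""
    params = {"action": "query", "prop": "extlinks", "titles": title,
              "ellimit": "max", "format": "json"}
    data = requests.get(API, params=params, headers=HEADERS).json()
    page = next(iter(data["query"]["pages"].values()))
    return [el["*"] for el in page.get("extlinks", [])]

if __name__ == "__main__":
    for title in recent_titles():
        for url in external_links(title):
            print(title, url)  # an antispam or antivirus check could hook in here
        time.sleep(1)          # be polite; this is a sketch, not a scanner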

-Mike

_______________________________________________
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: (no subject) [ In reply to ]
Mike.lifeguard wrote:
>> Simple matter of coding, then? :-)
>
> This sort of thing would help with some of our external antispam tools.
> Currently we rely on parsing edits manually to see when links are added
> - some realtime and machine-readable format for notifications of such
> edits would be great.
>
> -Mike

File a bug?
This probably depends on bug 17450 and should block 16599.


_______________________________________________
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: (no subject) [ In reply to ]
On Thu, Jan 28, 2010 at 5:02 PM, Tei <oscar.vives@gmail.com> wrote:

> On 28 January 2010 15:06, 李琴 <qli@ica.stc.sh.cn> wrote:
> > Hi all,
> > I have built a LocalWiki. Now I want its data to stay consistent with
> > Wikipedia, and one task I need to do is to get the update data from
> > Wikipedia.
> > I get the URLs by analyzing the RSS feed
> > (http://zh.wikipedia.org/w/index.php?title=Special:%E6%9C%80%E8%BF%91%E6%9B%B4%E6%94%B9&feed=rss)
> > and get all the HTML content of the edit box by opening each URL and
> > clicking ’edit this page’.
> ....
> > Is that because I visit it too frequently and my IP address has been
> > blocked, or because the network is too slow?
>
> 李琴, well... that's web scraping, which is a poor technique, one with lots
> of errors that generates lots of traffic.
>
> One thing a robot must do is read and follow the
> http://zh.wikipedia.org/robots.txt file (you should probably read it
> too).
> As a general rule of the Internet, a "rude" robot will be banned by the
> site admins.
>
> It would be a good idea to announce your bot as a bot in the user_agent
> string. Good bot behavior is to read a website the way a human would. I
> don't know, maybe 10 requests a minute? I don't know this
> "Wikipedia" site's rules about it.
>
> What you are suffering from could be automatic or manual throttling, since
> an abusive number of requests has been detected from your IP.
>
> "Wikipedia" seems to provide full dumps of its wikis, but they are unusable
> for you, since they are gigantic :-/. Trying to rebuild Wikipedia on your
> PC from a snapshot would be like summoning Cthulhu in a teapot. But... I
> don't know, maybe the zh version is smaller, or your resources are
> powerful enough. One feels that what you have built has severe
> overhead (a waste of resources) and there must be better ways to do
> it...
>
Indeed there are. What you need:
1) the Wikimedia IRC live feed - last time I looked at it, it was at
irc://irc.wikimedia.org/ and each project had its own channel.
2) a PHP IRC bot framework - Net_SmartIRC is well-written and easy to get
started with
3) the page source, which you can EASILY get either in rendered form
http://zh.wikipedia.org/w/index.php?title=TITLE&action=render or in raw form
http://zh.wikipedia.org/w/index.php?title=TITLE&action=raw (this is the page
source; see the fetch sketch below).
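
To illustrate the fetching half of this (item 3 above, not the IRC feed), here
is a minimal sketch in Python using the requests library, with a descriptive
user agent and a throttle along the lines Tei suggested. The bot name, contact
address and delay are made-up placeholders:

import time
import requests

# Identify the bot clearly; the name and contact address are placeholders.
HEADERS = {"User-Agent": "zh-localwiki-sync-bot/0.1 (contact: example@example.org)"}

def fetch_raw(title, delay=6.0):
    """Fetch the raw wikitext of one page, then wait before the next request."""
    resp = requests.get(
        "http://zh.wikipedia.org/w/index.php",
        params={"title": title, "action": "raw"},
        headers=HEADERS,
        timeout=30,
    )
    resp.raise_for_status()
    time.sleep(delay)  # roughly 10 requests per minute, as suggested above
    return resp.text

print(fetch_raw("Wikipedia")[:200])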

Marco

--
VMSoft GbR
Nabburger Str. 15
81737 München
Geschäftsführer: Marco Schuster, Volker Hemmert
http://vmsoft-gbr.de
_______________________________________________
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: (no subject) [ In reply to ]
On Wed, Jul 31, 2013 at 11:23 AM, Tyler Romeo <tylerromeo@gmail.com> wrote:

> Hey all,
>
> Mozilla made an announcement yesterday about a new framework called Minion:
>
> http://blog.mozilla.org/security/2013/07/30/introducing-minion/
> https://github.com/mozilla/minion
>
> It's an automated security testing framework for use in testing web
> applications. I'm currently looking into how to use it. Would there be any
> interest in setting up such a framework for automated security testing of
> MediaWiki?
>

Looks interesting! Sounds like something for the QA list:
https://lists.wikimedia.org/mailman/listinfo/qa
_______________________________________________
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: (no subject) [ In reply to ]
On Wed, Jul 31, 2013 at 11:23 AM, Tyler Romeo <tylerromeo@gmail.com> wrote:
> Hey all,
>
> Mozilla made an announcement yesterday about a new framework called Minion:
>
> http://blog.mozilla.org/security/2013/07/30/introducing-minion/
> https://github.com/mozilla/minion
>
> It's an automated security testing framework for use in testing web
> applications. I'm currently looking into how to use it. Would there be any
> interest in setting up such a framework for automated security testing of
> MediaWiki?

I'm definitely interested in seeing if we can leverage something like
this. I'm not sure where it would fit alongside our current automated
testing, but I think it would be valuable to at least take a closer
look. And it's nice to see they're supporting ZAP and skipfish,
although unless they allow for more detailed configurations, both take
ages to completely scan a MediaWiki install.

If you get it running, please share your experience.

> *-- *
> *Tyler Romeo*
> Stevens Institute of Technology, Class of 2016
> Major in Computer Science
> www.whizkidztech.com | tylerromeo@gmail.com

_______________________________________________
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: (no subject) [ In reply to ]
OK, so after a bit of trouble I managed to get it working on my Vagrant
instance.

Here's a brief summary of what I learned:
* It uses a MongoDB backend with Python and Flask as a front-end
* There are plugins that implement certain tests (e.g., nmap, skipfish)
* Plans are combinations of plugins, basically a test plan
* Sites are added into groups, and are then assigned plans
* Finally, you run plans on the frontend and they're run by a celery job
queue

From the looks of it, I don't think this would be particularly useful for
individual developers, because many of the tests require a full TLS setup
and whatnot.

What might be useful is to have a security instance running MediaWiki with
a similar setup to the actual en-wiki, and then have Minion running on an
instance and have it run the tests that way. Unfortunately, I don't know
how we would manage users (since it doesn't have LDAP integration) or when
we would run these tests (I'd imagine there wouldn't be a need to run them
on every change).

Thoughts?

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
www.whizkidztech.com | tylerromeo@gmail.com


On Wed, Jul 31, 2013 at 2:39 PM, Chris Steipp <csteipp@wikimedia.org> wrote:

> On Wed, Jul 31, 2013 at 11:23 AM, Tyler Romeo <tylerromeo@gmail.com>
> wrote:
> > Hey all,
> >
> > Mozilla made an announcement yesterday about a new framework called
> Minion:
> >
> > http://blog.mozilla.org/security/2013/07/30/introducing-minion/
> > https://github.com/mozilla/minion
> >
> > It's an automated security testing framework for use in testing web
> > applications. I'm currently looking into how to use it. Would there be
> any
> > interest in setting up such a framework for automated security testing of
> > MediaWiki?
>
> I'm definitely interested in seeing if we can leverage something like
> this. I'm not sure where it would fit alongside our current automated
> testing, but I think it would be valuable to at least take a closer
> look. And it's nice to see they're supporting ZAP and skipfish,
> although unless they allow for more detailed configurations, both take
> ages to completely scan a MediaWiki install.
>
> If you get it running, please share your experience.
>
> > *-- *
> > *Tyler Romeo*
> > Stevens Institute of Technology, Class of 2016
> > Major in Computer Science
> > www.whizkidztech.com | tylerromeo@gmail.com
_______________________________________________
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: (no subject) [ In reply to ]
<quote name="Tyler Romeo" date="2013-07-31" time="16:21:50 -0400">
> What might be useful is to have a security instance running MediaWiki with
> a similar setup to the actual en-wiki, and then have Minion running on an
> instance and have it run the tests that way. Unfortunately, I don't know
> how we would manage users (since it doesn't have LDAP integration) or when
> we would run these tests (I'd imagine there wouldn't be a need to run them
> on every change).

Tyler: mind reporting this as an enhancement bug in deployment-prep?
Include things like what is needed to get it working etc.

Might be something we could get running against the beta cluster,
perhaps.

Greg

--
| Greg Grossmeier GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg A18D 1138 8E47 FAC8 1C7D |

_______________________________________________
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: (no subject) [ In reply to ]
On Wed, Jul 31, 2013 at 5:00 PM, Greg Grossmeier <greg@wikimedia.org> wrote:

> Tyler: mind reporting this as an enhancement bug in deployment-prep?
> Include things like what is needed to get it working etc.
>
> Might be something we could get running against the beta cluster,
> perhaps.
>

Sure thing: https://bugzilla.wikimedia.org/show_bug.cgi?id=52354

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
www.whizkidztech.com | tylerromeo@gmail.com
_______________________________________________
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: (no subject) [ In reply to ]
> For external uses like XML dumps integrating the compression
> strategy into LZMA would however be very attractive. This would also
> benefit other users of LZMA compression like HBase.

For dumps or other uses, 7za -mx=3 / xz -3 is your best bet.

That has a 4 MB buffer, compression ratios within 15-25% of
current 7zip (or histzip), and goes at 30MB/s on my box,
which is still 8x faster than the status quo (going by a 1GB
benchmark).
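
For anyone driving this from code rather than the command line, here is a rough
equivalent sketch using Python's standard lzma module (preset=3 is meant to
match the xz -3 trade-off described above; the file names are placeholders):

import lzma
import shutil

# Compress a dump at preset 3, roughly the xz -3 / 7za -mx=3 trade-off above.
with open("dump.xml", "rb") as src:
    with lzma.open("dump.xml.xz", "wb", preset=3) as dst:
        shutil.copyfileobj(src, dst)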

Trying to get quick-and-dirty long-range matching into LZMA isn't
feasible for me personally and there may be inherent technical
difficulties. Still, I left a note on the 7-Zip boards as folks
suggested; feel free to add anything there:
https://sourceforge.net/p/sevenzip/discussion/45797/thread/73ed3ad7/

Thanks for the reply,
Randall



On Tue, Jan 21, 2014 at 2:19 PM, Randall Farmer <randall@wawd.com> wrote:

> > For external uses like XML dumps integrating the compression
> > strategy into LZMA would however be very attractive. This would also
> > benefit other users of LZMA compression like HBase.
>
> For dumps or other uses, 7za -mx=3 / xz -3 is your best bet.
>
> That has a 4 MB buffer, compression ratios within 15-25% of
> current 7zip (or histzip), and goes at 30MB/s on my box,
> which is still 8x faster than the status quo (going by a 1GB
> benchmark).
>
> Re: trying to get long-range matching into LZMA, first, I
> couldn't confidently hack on liblzma. Second, Igor might
> not want to do anything as niche-specific as this (but who
> knows!). Third, even with a faster matching strategy, the
> LZMA *format* seems to require some intricate stuff (range
> coding) that may be a blocker to getting the ideal speeds
> (honestly not sure).
>
> In any case, I left a note on the 7-Zip boards as folks have
> suggested: https://sourceforge.net/p/sevenzip/discussion/45797/thread/73ed3ad7/
>
> Thanks for the reply,
> Randall
>
>
_______________________________________________
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: (no subject) [ In reply to ]
You should resend this email from a different address and with a subject;
Gmail thinks it's spam, so almost no one will ever see it.

-- Matma Rex


2014-07-10 8:47 GMT+02:00 Thomas Mulhall <thomasmulhall410@yahoo.com>:

> Hi, we are upgrading jQuery Cookie from an early alpha version of 1.1 to
> 1.2. Please start upgrading your code to be compatible with jQuery Cookie
> 1.2. There is just one deprecation to note: $.cookie('foo', null) is now
> deprecated; replace it with $.removeCookie('foo') for deleting a cookie.
> We are slowly upgrading to version 1.4.1, but one step at a time, because
> it is a major change and removes a lot of things.
_______________________________________________
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: (no subject) [ In reply to ]
Hoi,
I use GMAIL and I see it plenty fine.
Thanks,
GerardM


On 10 July 2014 11:51, Bartosz Dziewoński <matma.rex@gmail.com> wrote:

> You should resend this email from a different address and with a subject,
> GMail thinks it's spam, so almost no one will ever see it.
>
> -- Matma Rex
>
>
> 2014-07-10 8:47 GMT+02:00 Thomas Mulhall <thomasmulhall410@yahoo.com>:
>
> > Hi, we are upgrading jQuery Cookie from an early alpha version of 1.1 to
> > 1.2. Please start upgrading your code to be compatible with jQuery Cookie
> > 1.2. There is just one deprecation to note: $.cookie('foo', null) is now
> > deprecated; replace it with $.removeCookie('foo') for deleting a cookie.
> > We are slowly upgrading to version 1.4.1, but one step at a time, because
> > it is a major change and removes a lot of things.
_______________________________________________
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: (no subject) [ In reply to ]
On Thu, Jul 10, 2014 at 5:59 AM, Gerard Meijssen <gerard.meijssen@gmail.com>
wrote:

> I use GMAIL and I see it plenty fine.


I also use Gmail, and it says the only reason it wasn't sent to spam was
because I have a filter sending all wikitech emails to my inbox.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
_______________________________________________
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: (no subject) [ In reply to ]
Hoi,
no filters for me..
Thanks,
GerardM


On 10 July 2014 13:39, Tyler Romeo <tylerromeo@gmail.com> wrote:

> On Thu, Jul 10, 2014 at 5:59 AM, Gerard Meijssen <
> gerard.meijssen@gmail.com>
> wrote:
>
> > I use GMAIL and I see it plenty fine.
>
>
> I also use Gmail, and it says the only reason it wasn't sent to spam was
> because I have a filter sending all wikitech emails to my inbox.
>
> *-- *
> *Tyler Romeo*
> Stevens Institute of Technology, Class of 2016
> Major in Computer Science
_______________________________________________
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: (no subject) [ In reply to ]
On Thu, Jul 10, 2014 at 2:32 PM, Gerard Meijssen <gerard.meijssen@gmail.com>
wrote:

> Hoi,
> no filters for me..
> Thanks,
> GerardM

Congratulations, we are all happy to hear that. Just FYI: The subjectless
e-mail fell into the spam folder for me at Gmail, too. And I guess we could
all agree sending e-mails with no subject to a large mailing list is just
not a great idea anyway. So… do we need to spend any more e-mails debating
this?

Thanks.
-- [[cs:User:Mormegil | Petr Kadlec]]
_______________________________________________
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: (no subject) [ In reply to ]
Welcome to the list Angela! Let us know if there's anything we can help
with. Also you may want to join the IRC channels #wikimedia-dev and
#wikimedia-tech. Cheers!

Ryan Kaldari

On Thu, Feb 19, 2015 at 11:22 AM, Angela lum neh <lumneh.angela385@gmail.com
> wrote:

> Hello everyone, I am Angela. I am happy to be part of this mailing list.
_______________________________________________
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: (no subject) [ In reply to ]
On 08/10/2015 21:04, Purodha Blissenbach wrote:
> In a simple message text typo fix:
> https://gerrit.wikimedia.org/r/#/c/243332/
> in the job
> mwext-testextension-zend
> the test
> PHPUnit
> failed with these messages:
>
> + php phpunit.php --with-phpunitdir
> /srv/deployment/integration/phpunit/vendor/phpunit/phpunit --log-junit
> /mnt/jenkins-workspace/workspace/mwext-testextension-zend/log/junit-phpunit-allexts.xml
> --testsuite extensions
> 18:43:29 You MUST install Composer dependenciesRecording test results
> 18:43:29 ERROR: Publisher 'Publish JUnit test result report' failed: No
> test report files were found. Configuration error?
> 18:43:29 [PostBuildScript] - Execution post build scripts.
> 18:43:29
>
> I have no clue what to do to fix it.

Hello,

The tests were broken by the previous human-made change:
https://gerrit.wikimedia.org/r/#/c/136152/

That one was not passing tests and was force-merged. The end result is that
the extension tests are now broken.

The error message is missing a newline and should read as:

You MUST install Composer dependencies


From composer.json that means:

guzzlehttp/guzzle: ~3.8


The Jenkins job mwext-testextension-zend does not install Composer
dependencies; instead it clones mediawiki/vendor.git, which is the repo
holding the dependencies for the Wikimedia cluster.

So I guess we need to change it to mwext-testextension-zend-composer, which
does run Composer.

It is definitely worth a task in Phabricator for history purposes. But
the change itself is straightforward.


--
Antoine "hashar" Musso


_______________________________________________
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
