It sometimes happens in people’s lives that someone tells them something that sounds true and obvious at the time. It turns out that it actually is objectively true, and it is also obvious, or at least sensible, to the person who hears it, but it’s not obvious to other people. Because it was obvious to the person who heard it, though, they assume it’s obvious to everyone else, even though it isn’t.

It happens to everyone, and we are probably all bad at consistently noticing it, remembering it, and reflecting on it.

This post is an attempt to reflect on one such occurrence in my life; there were many others.

(Comment: This whole post is just my opinion. It doesn’t represent anyone else. In particular, it doesn’t represent other translatewiki.net administrators, MediaWiki developers or localizers, Wikipedia editors, or the Wikimedia Foundation.)


There’s the translatewiki.net website, where the user interface of MediaWiki, the software that powers Wikipedia, as well as that of some other Free Software projects, is translated into many languages. This kind of translation is also called “localization”. I’ve mentioned it several times on this blog, most importantly in Amir Aharoni’s Quasi-Pro Tips for Translating the Software That Powers Wikipedia, 2020 Edition.

Siebrand Mazeland used to be the community manager for that website. Now he’s less active there, and, although it’s a bit weird to say it, and it’s not really official, these days I kind of act like one of its community managers.

In 2010 or so, Siebrand heard something about a bug in the support of Wikipedia for a certain language. I don’t remember which language it was or what the bug was. Maybe I myself reported something in the display of Hebrew user interface strings, or maybe it was somebody else complaining about something in another language. But I do remember what happened next. Siebrand examined the bug and, with his typical candor, said: “The fix is to complete the localization”.

What he meant is that one of the causes of that bug, and perhaps the only cause, was that the volunteers who were translating the user interface into that language didn’t translate all the strings for that feature (strings are also known as “messages” in MediaWiki developers’ and localizers’ jargon). So instead of rushing to complain about a bug, they should have completed the localization first.

To generalize it, the functionality of all software depends, among many other things, on the completeness of user interface strings. They are essentially a part of the algorithm. They are more presentation than logic, but the end user doesn’t care about those minor distinctions—the end user wants to get their job done.

Those strings are usually written in one language—often English, but occasionally Japanese, Russian, French, or another one. In some software products, they may be translated into other languages. If the translation is incomplete, then the product may work incorrectly in some ways. On the simplest level, users who want to use that product in one language will see the user interface strings in another language, which they possibly can’t read. However, it may go beyond that: some writing systems require special fonts, and applying those fonts to letters from another writing system can look strange; strings that are supposed to be shown from left to right will be shown from right to left or vice versa; a text size that is good for one language can be wrong for another; and so forth.
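
To make the fallback mechanism concrete, here is a minimal, hypothetical sketch in Python (not MediaWiki’s actual code) of how a missing translation typically leaks the source language into the interface; the message keys and strings are invented for illustration:

```python
# Minimal sketch of interface-string lookup with an English fallback.
# The keys and translations here are invented for illustration.

MESSAGES = {
    "en": {"welcome": "Welcome, {name}!", "search": "Search"},
    # The Hebrew localization is incomplete: "welcome" was never translated.
    "he": {"search": "חיפוש"},
}

def get_message(key, lang, fallback="en"):
    """Return the string for `key` in `lang`, falling back to English."""
    return MESSAGES.get(lang, {}).get(key, MESSAGES[fallback][key])

# The untranslated key silently comes back in English, so a right-to-left
# Hebrew page ends up displaying a left-to-right English string:
print(get_message("welcome", "he").format(name="Amir"))  # Welcome, Amir!
print(get_message("search", "he"))                        # חיפוש
```

Completing the “he” entry makes the English string disappear without touching any code, which is exactly the point of “the fix is to complete the localization”.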

In many cases, simply completing the translation may quietly fix all those bugs. Now, there are reasons why the translation is incomplete: it may be hard to find people who know both English and this language well; the potential translator is a volunteer who is busy with other stuff; the language lacks the technical terminology needed to make the translations, and while this is not a blocker (new terms can be coined along the way), it may slow things down; a potential translator has good will and wants to volunteer their time, but hasn’t had a chance to use the product and doesn’t understand the messages’ context well enough to make a translation; etc. But in theory, if there is a volunteer who has the relevant knowledge and time, then completing the translation, by itself, fixes a lot of bugs.

Of course, it may also happen that the software actually has other bugs that completing the localization won’t fix, but that’s not the kind of bugs I’m talking about in this post. Or, going even further, software developers can go the extra mile and try to make their product work well even if the localization is incomplete. While this is usually commendable, it’s still better for the localizers to complete the localization. After all, it should be done anyway.

That’s one of the main things that motivate me to maintain the localization of MediaWiki and its extensions into Hebrew at 100%. From the perspective of the end users who speak Hebrew, they get a complete user experience in their language. And from my perspective, if there’s a bug in how something works in Wikipedia in Hebrew, then at least I can be sure that the reason for it is not an incomplete translation.


As one of the administrators of translatewiki, I try my best to make complete localization in all languages not just possible, but easy.¹ It directly flows out of Wikimedia’s famous vision statement:

Imagine a world in which every single human being can freely share in the sum of all knowledge. That’s our commitment.

I love this vision, and I take the words “every single human being” and “all knowledge” seriously; they implicitly mean “all languages”, not just for the content, but also for the user interface of the software that people use to read and write this content.

If you speak Hindi, for example, and you need to search for something in the Hindi Wikipedia, but the search form works only in English, and you don’t know English, finding what you need will be somewhere between hard and impossible, even if the content is actually written in Hindi somewhere. (Comment #1: If you think that everyone who knows Hindi and uses computers also knows English, you are wrong. Comment #2: Hindi is just one example; the same applies to all languages.)

Granted, it’s not always actually easy to complete the localization. A few paragraphs above, I gave several general examples of why it can be hard in practice. In the particular case of translatewiki.net, there are several additional, specific reasons. For example, translatewiki.net was never properly adapted to mobile screens, and that is an increasingly big problem. There are other examples, and all of them are, in essence, bugs. I can’t promise to fix them tomorrow, but I acknowledge them, and I hope that some day we’ll find the resources to fix them.


Many years have passed since I heard Siebrand Mazeland saying that the fix is to complete the localization. Soon after I heard it, I started dedicating at least a few minutes every day to living by that principle, but only today did I bother to reflect on it and write this post. The reason I did it today is surprising: I tried to do something about my American health insurance (just a check-up, I’m well, thanks). I logged in to my dental insurance company’s website, and… OMFG:

What you can see here is that some things are in Hebrew, and some aren’t. If you don’t understand the Hebrew parts, that’s OK, because you aren’t supposed to: they are for Hebrew speakers. But you should note that some parts are in English, and they are all supposed to be in Hebrew.

For example, you can see that the exclamation point is at the wrong end of “Welcome, Amir!“. The comma is placed unusually, too. That’s because they set the page direction to right-to-left for Hebrew, but didn’t translate the word “Welcome” in the user interface.² If they had translated it, the bug wouldn’t be there: it would correctly appear as “ברוך בואך, Amir!“, and no fixes in the code would be necessary.
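
As an aside, this misplaced punctuation can be reproduced with the Unicode bidirectional algorithm. The sketch below is purely illustrative and assumes the third-party python-bidi package (which has nothing to do with the insurance site); it shows how a trailing exclamation mark moves when an untranslated English string is laid out in a right-to-left page:

```python
# Hypothetical illustration using the third-party python-bidi package
# (pip install python-bidi); not code from the website discussed here.
from bidi.algorithm import get_display

# An untranslated English greeting placed in a right-to-left (Hebrew) page:
# the trailing "!" resolves to the paragraph direction and jumps to the
# visual left, away from the sentence it ends.
print(get_display("Welcome, Amir!", base_dir="R"))

# A translated greeting in the same right-to-left page keeps the
# exclamation mark at the reading-order end of the sentence.
print(get_display("ברוך בואך, Amir!", base_dir="R"))
```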

You can also see a misplaced exclamation point at the end of “Thanks for being a Guardian member!“.

There are also less obvious bugs here. You can also see that in the word “WIKIMEDIA” under the “Group ID” dropdown, the letter “W” is only partly visible. That’s also a typical RTL bug: the menu may be too narrow for a long string, so the string can be visually truncated, but the truncation should happen at the end of the string and not at the beginning. Because the software here thinks that the end is on the left, the beginning gets truncated instead. This is not exactly an issue that can be fixed just by completing the localization, but if the localization were complete, it would be easier to notice.

There are other issues that you don’t notice if you don’t know Hebrew. For example, there’s a button with a weird label at the top right. Most Hebrew speakers will understand that label as “a famous website”, which is probably not what it is supposed to say. It’s more likely that it’s supposed to say “published web page”, and the translator made a mistake. Completing the translation correctly would fix this mistake: a thorough translator would review their work, check all the usages of the relevant words, and likely come up with a correct translation. (And maybe the translation is not even made by a human but by machine translation software, in which case it’s the product manager’s mistake. Software should never, ever be released with user interface strings that were machine-translated and not checked by a human.)

Judging by the logo at the top, the dental insurance company used an off-the-shelf IBM product for managing clients’ info. If I ask IBM or the insurance company nicely, will they let me complete the localization of this product, fixing the existing translation mistakes, and filing the rest of the bugs in their bug tracking software, all without asking for anything in return? Maybe I’ll actually try to do it, but I strongly suspect that they will reject this proposal and think that I’m very weird. In case you wonder, I actually tried doing it with some companies, and that’s what happened most of the time.

And this attitude is a bug. It’s not a bug in code, but it is very much a problem in product management and attitude toward business.


If you want to tell me, “Amir, why don’t you just switch to English and save yourself the hassle?”, then I have two answers for you.

The first answer is described in detail in a blog post I wrote many years ago: The Software Localization Paradox. Briefly: Sure, I can save myself the hassle, but if I don’t notice it and speak about it, then who will?

The second answer is basically the same, but with more pathos. It’s a quote from Avot 1:14, one of the most famous and cited pieces of Jewish literature outside the Bible: If I am not for myself, who is for me? But if I am for my own self, what am I? And if not now, when? I’m sure that many cultures have proverbs that express similar ideas, but this particular proverb is ours.


And if you want to tell me, “Amir, what is wrong with you? Why does it even cross your mind to want to help not one, but two ultramegarich companies for free?”, then you are quite right, idealistically. But pragmatically, it’s more complicated.

Wikimedia understands the importance of localization and lets volunteers translate everything. So do many other Free Software projects. But experience and observation taught me that for-profit corporations don’t prioritize good support for languages unless regulation forces them to do it or they have exceptionally strong reasons to think that it will be good for their income or marketing.

It did happen a few times that corporations that develop non-Free software let volunteers localize it: Facebook, WhatsApp, and Waze are somewhat famous examples; Twitter used to do it (but stopped long ago); and Microsoft occasionally lets people do such things. Also, Quora reached out to me to review the localization before they launched in Hebrew and even incorporated some of my suggestions.³

Usually, however, corporations don’t want to do this at all, and when they do it, they often don’t do it very well. But people who don’t know English want—and often need!—to use their products. And I never get tired of reminding everyone that most people don’t know English.

So for the sake of most of humanity, someone has to make all software, including the non-Free products, better localized and localizable. Of course, it’s not feasible or sustainable that I alone will do it as a volunteer, even for one language. I barely have time to do it for one language in one product (MediaWiki). But that’s why I am thinking of it: I would be not so much helping a rich corporation here as I would be helping people who don’t know English.

Something has to change in the software development world. It would, of course, be nice if all software became Freely-licensed, but if that doesn’t happen, it would be nice if non-Free software were more open to accepting localization from volunteers. I don’t know how this change will happen, but it is necessary.


If you bothered to read this far, thank you. I wanted to finish with two things:

  1. To thank Siebrand Mazeland again for doing so much to lay the foundations of the MediaWiki localization and the translatewiki community, and for saying that the fix is to complete the localization. It may have been an off-hand remark at the time, but it turned out that there was much to elaborate on.
  2. To ask you, the reader: If you know any language other than English, please use all apps, websites, and devices in that language as much as you can, bother to report bugs in their localization into that language, and invest some time and effort into volunteering to complete the localization of this software into your language. Localizing the software that runs Wikipedia would be great. Localizing OpenStreetMap is a good idea, too, and it’s done on the same website. Other projects that are good for humanity and that accept volunteer localization are Mozilla, Signal, WordPress, and BeMyEyes. There are many others.⁴ It’s one of the best things that you can do for the people who speak your language and for humanity in general.

¹ And here’s another acknowledgement and reflection: This sentence is based on the first chapter of one of the most classic books about software development in general and about Free Software in particular: Programming Perl by Larry Wall (with Randal L. Schwartz, Tom Christiansen, and Jon Orwant): “Computer languages differ not so much in what they make possible, but in what they make easy”. The same is true for software localization platforms. The sentence about the end user wanting to get their job done is inspired by that book, too.

² I don’t expect them to have my name translated. While it’s quite desirable, it’s understandably difficult, and there are almost no software products that can store people’s names in multiple languages. Facebook kind of tries, but does not totally succeed. Maybe it will work well some day.

³ Unfortunately, as far as I can tell, Quora abandoned the development of the version in Hebrew and in all other non-English languages in 2022, and in 2023, they abandoned the English version, too.

⁴ But please think twice before volunteering to localize blockchain or AI projects. I heard several times about volunteers who invested their time into such things, and I was sad that they wasted their volunteering time on this pointlessness. Almost all blockchain projects are pointless. With AI projects, it’s much more complicated: some of them are actually useful, but many are not. So I’m not saying “don’t do it”, but I am saying “think twice”.

On June 28, 2024, the WikiForHumanRights (W4HR) in Nigeria 2024 Campaign hosted its virtual launch, attracting 100 participants from 21 states in Nigeria. The event, led by 2 National coordinators, 7 community coordinators, and 6 working team members as part of the W4HR 2024 international campaign, celebrated the 75th anniversary of the Universal Declaration of Human Rights. The meeting lasted for 2 hours and aimed to encourage various Wikimedia communities to create knowledge that documents: “How Human Rights Knowledge Creates a Sustainable Future.”

The W4HR in Nigeria 2024 campaign focuses on raising awareness and understanding of human rights issues related to sustainability in Nigeria. It addresses topics such as sustainable agriculture, environmental sustainability, clean water, and waste management. The campaign identifies significant knowledge gaps on Wikipedia, Wikidata, Wikimedia Commons, and Wikivoyage regarding these issues in Nigeria, particularly in local languages like Igala, Yoruba, Igbo, Tyap, and Hausa, which hampers community awareness and engagement. The campaign aims to achieve a target of 1,230 articles, items, and media files through content expansion, translation, and creation. To reach this goal, 148 new and existing editors were recruited across 7 communities, exceeding expectations before the virtual launch with 228 registered participants. These participants include students and professionals from different communities, who are passionate about enhancing the quality and accessibility of information, aligning their efforts with the Sustainable Development Goals (SDGs) and international human rights standards.

Key Highlights

The virtual launch, moderated by Miracle James, commenced with an inspiring talk from Euphemia Uwandu, Program Officer for Campaign Programs at the Wikimedia Foundation. Euphemia, who also serves as the Coordinator and general overseer of the WikiForHumanRights Campaigns, set the tone by providing a brief introduction to the campaign, highlighting its aims, objectives, and goals, and encouraging participants to contribute. The session proceeded to showcase one of our partners as part of our Partners Spotlight activities, aiming to educate participants on how they can contribute to the campaign by learning from Civil Society Organizations (CSOs) and Non-Governmental Organizations (NGOs) working on similar themes. The spotlight was on Plogging Nigeria, a non-profit organization that promotes environmental sustainability through organized activities called ‘Plogging Episodes’ among others. Mayokun Iyaomolere, the Founder of Plogging Nigeria, spoke about their initiatives aligning with the W4HR in Nigeria Campaign, with a specific focus on waste management and environmental sustainability.

The session then advanced with an overview from Kemi Makinde, one of the National Coordinators for the W4HR in Nigeria 2024, who enlightened participants on the theme and relevance of the campaign, as well as the timeline of implementation. This was followed by an introduction to the experienced trainers and reviewers who will be leading the general training, as documented on the campaign’s homepage. The trainers and reviewers, carefully selected based on their previous experience in leading similar trainings and coordinating campaigns, included:

  1. Rhoda James is the Wikidata Trainer for the W4HR in Nigeria 2024 and the creative director at the Wikimedia User Group Nigeria.
  2. Iwuala Lucy is the Wikivoyage and Wikimedia Commons Trainer and Reviewer for the W4HR in Nigeria 2024 and a language professional with a degree in Language Studies. She is an advocate for indigenous language revitalization and free knowledge dissemination. She is also a member of the Regional Grant Committee and the Charter Electoral Committee, and Social Media Manager/Member of the Wiki Loves Monuments International Team.
  3. Barakat Adegboye is the W4HR in Nigeria 2024 Wikipedia Trainer and a Wikimedia volunteer from Nigeria. She also serves as the Programs Lead (Intern) at the Wikimedia User Group Nigeria.
  4. Muib Shefiu is an experienced Wikimedia editor who has created hundreds of articles on the English Wikipedia. He has organised and facilitated different Wikimedia programmes. He is the founder of Afrodemics and the winner of the 2023 Wikimedia User Group Nigeria Editor of the Year award. Currently, he is a new page reviewer on the English Wikipedia and also a reviewer for this campaign.
  5. Blessing Linason is the Wikidata Reviewer for the W4HR in Nigeria 2024 and the Co-founder of the Kwara State University Wikimedia Fan Club.

The session further proceeded with an introduction to the local coordinators who shed more light on their individual community activities and expectations from their members as champions contributing to this campaign. The communities included:

  1. WUGN Anambra Network led by Dr. Ngozi Osuchukwu
  2. Igala Community led by Agnes Abah
  3. WUGN Osun led by Adetoro Praise
  4. Port Harcourt Wikimedia Nigeria Hub led by Jeremiah Ugwulebo
  5. WUGN Kaduna Network led by Ramatu A Haliru
  6. WUGN Imo Network led by Emmanuel Obiajulu
  7. WUGN Gombe Network led by Ismael Atiba

Upon completion of the introductions from both the local coordinators and the working team, as one of the National Coordinators for this campaign, I (Bukola James) took the lead in walking participants through the campaign home page, resources, topic list, and reports to support their contributions. This was followed by a practical session in which the reviewers explained the score guide and the criteria for rewards at the end of the project.

Conclusion

The virtual launch concluded with an interactive Q&A session where participants had the opportunity to express their concerns and ask questions for clarity. At the end of the Q&A, participants had the opportunity to interact and network among themselves.

For those who may have missed the session, the link to access it is available on the community training schedule. Additionally, we encourage you to register under any of the 7 organizing communities to be part of our upcoming training sessions and to ensure you receive timely email notifications one hour before each online session begins. Let’s work together to create knowledge for a sustainable future and bridge the gap in information about sustainable agriculture, environmental sustainability, clean water, and waste management.

A big thank you to the Wikimedia Foundation, Coordinators, Working Team, Partners, and Participants!

As part of the Wiki Women in Red @8 campaign in 2023, we collaborated with various organizations, such as the University of Professional Studies Office of the Women’s Commissioner and the former Women’s Commissioner for the University of Ghana, to conduct a series of workshops and trainings, empowering over 60 new editors through various in-person and online workshops. There was also an online contest in which both existing and new editors could participate. These workshops educated and empowered women, especially with Wikipedia and Wikimedia Commons skills. Part one of this article provided an overview of the campaign and the various activities which took place. In part two of the article (this post) we highlight some of the challenges, how we navigated our way through the campaign, and what participants had to say.

Challenges

IP block

One of the main challenges encountered throughout this training was IP blocks on English Wikipedia. Although many of the participants were eager to edit Wikipedia, this was a major obstacle. To counter this challenge we asked participants to create their Wikipedia accounts through Wikimedia Commons. We also tried to create Wikipedia accounts for others using the dashboard, and later on requested participants to email Graham with their username and IP. Even after all these steps they still couldn’t edit on English Wikipedia. This experience was very discouraging for the participants, as they were eager to make improvements to wiki pages and to practice. As a result we had to focus more on creating Wikidata items and uploading images to Wikimedia Commons, which was fulfilling. At the training most did not have images about women, so they resorted to uploading some existing images not related to the topic for their hands-on practice. We noticed that this experience is usually encountered by newcomers. We hope something can be done about this for future events.

Lack of sources for articles about women

Sources can be a challenge when writing about women. For example, there are a lot of women’s football teams in Ghana, but one of the challenges was the fact that they lacked sources/references from which to create the articles. Even for those that had a few publications, not much was documented about them. In future we will also put some effort into translating existing articles.

Notability on Wikidata

Wikidata was one of the projects the campaign focused on. Some Wikidata items created by the volunteers were deleted for not meeting notability criteria, even though some of the team members felt the subjects were notable. We believe that the issue of notability is not universal and can vary from community to community. Nevertheless, we ensured that the guidelines were adhered to, to avoid further deletions.

Excess participants

Another challenge was that the campaign attracted a lot of new participants: over 100 new participants in our WhatsApp group who were eager to learn new skills using Wikipedia. As a result we were unable to extend one-on-one support to, and host, all members during the in-person training, although they were eager to participate. Another challenge was the location where we hosted the training, which didn’t allow us to accommodate participants from afar; however, we held general online training sessions which some benefited from.

Challenge using Dashboard

Using the dashboard was great; however, we had a lot of work digging into the data to discover what was relevant for our campaign. In doing that exercise we realized that the dashboard captured many articles that were edited within the period of the campaign, whether or not they were part of it. We found this by checking the article histories. That exercise, although cumbersome, helped us to identify the actual articles and contributions made by participants of the campaign.

Event registration tool

Another challenge we encountered was with creating the event registration page. The first challenge was the delay in granting our request to be able to create the campaign page. We had to request the organizer right before we could actually create an event registration page. Due to this delay we used a Google form to start recruiting and then switched back to the event registration tool when it was ready, leading to duplication of registrations.

Secondly, we had already set up our main campaign page, with registration ongoing. We then learned that we could not build the registration tool on the existing campaign meta page; it should have been set up from the onset, when the meta page was first created. This led us to create a separate page for the event registration. In future we will take note of this, as having the event tool on the main campaign page minimizes duplication.

Another thing we learned was that because we had already created a dashboard, with editors already signed up, we couldn’t link that dashboard to the event registration page that was being created, so we maintained the already created dashboard.

We were also recruiting for multiple events at the same time. This led us to use a Google form alongside the tool, since we did not want to create another event registration page.

We couldn’t add our own questions to the event registration questions for evaluation purposes.

In spite of the challenges with the registration tool, we found it very easy to use, and it had features that provided us with information about participants; we were also able to send messages to participants through the event registration page to keep them posted. This tool is a game changer for Wikimedia organizers, and with continued improvement it will serve its ultimate purpose in the community.

Recommendations

Although the on-wiki registration tool is great, there are still some improvements needed to make it easier for organizers to use. Some of the recommendations from our team are:

1. All organizers should be given access rights to create their own event registration pages, or organizer requests to create registration pages should be granted speedily.

2. We recommend a flexible way for the event page to be adjusted and refreshed. Another question we asked was what happens if the same page is maintained for another event the following year: would that mean we keep creating a separate event page, or can we amend the existing registration to make way for the new campaign?

3. More education is needed around the limitations of the tool to help organizers plan ahead. For example: you cannot create a registration page on an already existing meta page; the registration page creation works along with the dashboard creation; and you cannot embed your existing dashboard in the registration page.

4. In future we hope that the registration tool can adopt some of the features from Google Forms, such as providing downloadable infographics summarizing the characteristics of registered participants, and the ability to download that information.

5. The event registration tool’s questions should be made flexible so that other questions can be added to meet organizers’ needs. For example, we are not even able to ask where participants come from, or other questions we would want answered about our target population.

6. There should be an option for an on-wiki messaging tool that sends messages to participants whose emails are not linked to their accounts, so they can receive update notifications on their Wikipedia accounts. At the moment, mass messaging works only through email.

Opportunities

This campaign opened a lot of opportunities for us as an organization. This includes partnering with the University of Professional Studies, which is looking forward to more such training for its students, and with the University of Ghana.

After learning about our Wikipedia training program, we were approached by Doctor Yaw at the Center for Climate Change and Sustainability Studies at the University of Ghana about a partnership with their institution to incorporate Wikipedia education as part of their semester learning activities for Masters and undergraduate students. During our partnership discussion we identified possible collaboration opportunities, including starting a wiki club for the Sustainability and Climate Change Association. We see this as one of the long-term goals: to build their capacity on Wikipedia, map content gap areas, and bridge knowledge gaps around sustainability and gender, as that is also part of their work. We foresee the possibility of engaging with them as part of the WikiForHumanRights Campaign.

To sustain our community’s interest and offer continued support, we constantly share online events, e.g. from the Let’s Connect Telegram group, as well as other events to benefit from. Continued programs and engagement are needed to help.

Resources and Support

One of the ways we resolved IP block issues was guiding participants to send emails to an admin requesting an IP unblock. While some felt it was a long process, others were able to make the request and had their IPs unblocked days after the training, which allowed them to improve existing Wikipedia articles.

We also held office hours to help them with IP unblocks, and made a video recording on how to get an IP unblock, which we hosted on our YouTube channel.

We also created a special Wikipedia community WhatsApp group where we enrolled everyone who was interested in the campaign to learn about Wikipedia. The campaign attracted many people to join our Wikipedia community, and we currently have 190 members whom we will be engaging from time to time to build their capacity on Wikipedia.

As part of the resources, we also created an article list for participants to start with, almost all of which have since been created. We also shared the article list for Wiki Women in Red.

We received tremendous support from two staff members on the Wikimedia Foundation event registration team, Euphemia and Ilana, who trained us on how to use the event registration tool. They also assisted us in activating wiki bulk SMS to send invitations to participants who could be potential candidates for the campaign. Although that also had its own limitations, it was exciting to have made use of it to invite 20 experienced editors to join the contest, although we didn’t get much response from that outreach.

We also had massive support from Wiki Women in Red, who liked and reshared our campaign on Twitter, which boosted the campaign’s visibility. This was really encouraging. We also exchanged some beautiful swag with Wiki Women in Red in Scotland as a kind gesture, and some of the swag they sent us was given to contributors to the campaign.

Experienced editors: Having experienced editors providing us with guidance and support was very encouraging and contributed to the success of the program. With their wealth of experience they provided support where they could. They assisted me with mentoring selected participants who demonstrated a willingness to learn. They also trained my team to perform tasks like editing the meta page, creating the dashboard, and leading some of the training, amongst others. This has really increased the editing skills of our team, and some have continued to join other campaigns, contributing to translations. Some of the newcomers have joined our Wikipedia team to support us during future trainings and campaigns.

Special thanks to the wonderful team and resource persons who supported the trainings: Jesse Aseidu Akrofi, Ruby D-Brown, Queen Murjanatu, Garbrialla, Anita Ofori, Kojo Owusu, Phillip.

Results

Although the theme was documenting articles about women in sport, we realized that participants wanted to document more than women in sport. When we asked them what topics they were interested in contributing to, women in tech and women’s empowerment stood out in addition to women in sports. We went on to create Wikidata items and Wikipedia articles about women’s sport organizations/teams, organizations focused on women, women in sustainability education, etc.

We were cognisant of the fact that the dashboard sometimes tracks other sources of contributions. Although some of the newcomers who joined wrote about other subjects which were not about women, for the purposes of the impact attained we tracked articles related to women on the dashboard, and here are the statistics we gathered.

Articles related to women that we tracked from the dashboard were:

Wikipedia articles improved and newly created: 658

New Wikidata items: 600+

Wikimedia Commons images: 972

Testimonials

The testimonials participants gave after the event were encouraging, as most of them were excited and enthused about gaining this new knowledge. We have also shared video testimonials.

“Impressive”

“Educational”

“Wiki women in red adding our voice as women”

“The best, the fun, the zeal”

“It’s absolutely great to gain editing skills on Wikipedia.”

“The best people to lift women are women so let’s work together and support each other to get to the top”

“Much grateful for this initiative, i have learnt a lot within a short period of them and this has opened my minds to a lot of things on wikipedia”

“Wiki Women in Red workshop is quite an essential training workshop for women. It does not only promote women’s image and brand on the internet but equips others with very important digital skills in today’s world.”

“This is a good initiative. We need more of these. African Women deserve a spotlight too. Thank you so much for doing this!”

“I just love volunteering to anything about Wikimedia Foundation”

“That it’s easy and all it takes it dedication”

How has the Wikipedia Training changed your perspective about Wikipedia?

“I appreciate how far authentic recognition goes and the verification process of information”

“Anyone can edit on Wikipedia but one need to follow the five rules”

“It has helped me a lot as a blogger and a digital marketer, i have learned a new skill through this program, a skill have been wishing to learn for long.”

“This training has helped me realize that we can put more notable people out there because it’s people like me that put the information on wikipedia”

“Anyone can edit wikipedia”

“They are into women sustainability group that helps young ladies”

“I thought it was difficult and complicated to work edit on Wikipedia or I thought Wikipedia information was from an advanced source but now I know that it is simple”

“The training had thought me about the gender bias on Wikipedia and the need to level up”

“That Wikipedia can be a reliable source of information”

“Women need to be represented and works edited on Wikipedia is for everyone. Anyone can correct whatever information you put out there but credit will still be given to you”

“Wikipedia is free to edit and is for everyone”

Data

Gender

  • 75% – Women
  • 25% – Men

Age Group

  • 63.5% – 20-25
  • 12% – 25-30
  • 18.8% – 35+

Is this your first time hearing about Wiki Women in Red?

  • 50%- Yes
  • 50%- No

Is this your first time participating in a Wikipedia campaign/Training?

  • 68.8%- Yes
  • 31.3% – No

If YES, were you able to create your Wikipedia account?

  • 93.8% -Yes
  • 6.3% – No

Gallery

IA Upload upgraded

Tuesday, 16 July 2024 09:49 UTC

Fremantle

· IA Upload · PHP · upgrades · Wikimedia ·

I shifted IA Upload on to a new server today, where it's running on Debian 12 and PHP 8.2. So that means it's time to upgrade the tool's PHP dependencies, and as it's a Slimapp app, it seems that the first step is to get simplei18n working with a more modern version of Twig. So it's not going to get done today, it seems…

Wikimedia Côte d’Ivoire’s plans for 2027

Tuesday, 16 July 2024 09:48 UTC

“By 2030, Wikimedia will become the essential infrastructure of the free knowledge ecosystem, and all those who share our vision will be able to join us.” As a stakeholder in the development of the Wikimedia movement’s 2030 strategy, the Côte d’Ivoire User Group adheres to the ideals of equity and service advocated by the aforementioned strategic direction. It has been deploying a strategic action plan at the local level since January 2024.

The Ivorian Wikimedia community met in 2021 and 2022 to explore and discuss local needs, while integrating the recommendations of the international Wikimedia movement for the 2030 horizon. This participatory approach has laid the foundations for an ambitious strategy, articulated around five key priorities, designed to stimulate engagement, collaboration and impact in Côte d’Ivoire, West Africa and the French-speaking world.

This strategic proposal was submitted and adopted at the General Assembly in April 2023. This decision marks the start of a new era for the organization, which is firmly committed to shaping the future of free and open knowledge in Côte d’Ivoire.

“Between now and 2027, Wikimedia Côte d’Ivoire will have to invest in all Wikimedia platforms by developing more local content, enhancing the involvement of volunteers, opening up to new partnerships and developing strategic tools, while respecting the standards of the Wikimedia movement,” we can read on page … of the strategic action plan.

At the heart of this plan lies an unwavering commitment to enriching Wikimedia platforms with local content, aimed at preserving and promoting the richness of Ivorian and African culture on a global scale. At the same time, Wikimedia Côte d’Ivoire will strive to forge strategic partnerships with key players, whether traditional GLAMs (galleries, libraries, archives, and museums), the education and training sector, the media, public institutions, civil society organizations, or even private companies, in order to strengthen access to knowledge and amplify its impact across the country.

At the same time, the organization will focus on institutional and community development, as well as investment in the human and material resources needed to sustain its activities. Finally, particular attention will be paid to mobilizing diversified financial resources, crucial to guaranteeing the long-term viability of its initiatives.

The strategic axes defined by Wikimedia Côte d’Ivoire reflect its deep commitment to the promotion of free knowledge and equity in access to information. By following these axes, the organization actively contributes to the realization of the Wikimedia Foundation’s global vision, shaping a future where knowledge is accessible to all, without barriers or borders. To find out more about Wikimedia Côte d’Ivoire’s strategic direction and impact, please visit https://shorturl.at/efJKY.

Redesigned Wikimedia wishlist is open

Tuesday, 16 July 2024 04:40 UTC

Fremantle

· Wikimedia · Community Tech · work ·

The new system for the Community Wishlist was launched yesterday. It replaces the old annual system, in which there was a set period each year when people could propose wishes, followed by some weeks of voting. In the new system, wishes can be submitted at any time, and are gathered together into focus areas, and those are what will be voted on (again, at any time).

I think it's an improvement. The software for running it certainly is! We've built a data entry form, which reads and writes a wikitext table. There are also other parts that read all the wish templates into a (Toolforge) database and then write out various tables (all wishes, recent ones, etc.) into wiki pages.

There's more info about the launch in a Diff post: Share your product needs with the Community Wishlist

In her 1984 classic Reading the Romance, Janice Radway referred to the romance-purchasing customers of a small-town bookstore as a “female community … mediated by the distances of modern mass publishing. Despite the distance, the Smithton women feel personally connected to their favorite authors because they are convinced that these writers know how to make them happy” (Radway 1991, 97).

Reading the Romance is an important work because it gave attention to an otherwise dismissed genre and conceived of the readership as a community, even if only vaguely. Radway partly improved on this in her 1991 edition, admitting her theorization of community was “somewhat anemic in that it fails to specify precisely how membership in the romance-reading community is constituted.” Radway conceded the concept of an “interpretative community” (previously used to refer to critics and scholars of literature) might help, but “it cannot do complete justice to the nature of the connection between social location and the complex process of interpretation” (Radway 1991, 8).

This use of “interpretive community” for media audiences emerged in the seven years between her first and second editions. And, as she noted, it wasn’t a great fit. An “interpretive community” is a “collectivity of people who share strategies for interpreting, using, and engaging in communication about a media text or technology” (Lindlof 1988, 2002). Radway’s subjects shared little of this.

Rather, Radway was speaking of parasocial relationships between the readers and the author where mass media permit an “illusion of a face-to-face relationship with the performer” (Horton and Wohl 1956, 215)—the authors, in Radway’s case.

It’s interesting that while the concept of parasociality had existed for decades, Radway overlooked it and instead reached for the wrong one: interpretive communities.

References

Horton, Donald, and R. Richard Wohl. 1956. “Mass Communication and Para-Social Interaction.” Psychiatry 19 (3): 215–29. http://dx.doi.org/10.1080/00332747.1956.11023049.
Lindlof, Thomas R. 1988. “Media Audiences as Interpretive Communities.” Annals of the International Communication Association 11 (1): 81–107. http://dx.doi.org/10.1080/23808985.1988.11678680.
———. 2002. “Interpretive Community: An Approach to Media and Religion.” Journal of Media and Religion 1 (1): 61–74. http://dx.doi.org/10.1207/S15328415JMR0101_7.
Radway, Janice. 1991. Reading the Romance: Women, Patriarchy, and Popular Literature. Chapel Hill: University of North Carolina Press.

Tech/News/2024/29

Tuesday, 16 July 2024 01:39 UTC

Latest tech news from the Wikimedia technical community. Please tell other users about these changes. Not all changes will affect you. Translations are available.

Tech News survey

Recent changes

  • Advanced item Wikimedia developers can now officially continue to use both Gerrit and GitLab, due to a June 24 decision by the Wikimedia Foundation to support software development on both platforms. Gerrit and GitLab are both code repositories used by developers to write, review, and deploy the software code that supports the MediaWiki software that the wiki projects are built on, as well as the tools used by editors to create and improve content. This decision will safeguard the productivity of our developers and prevent problems in code review from affecting our users. More details are available in the Migration status page.
  • The Wikimedia Foundation seeks applicants for the Product and Technology Advisory Council (PTAC). This group will bring technical contributors and Wikimedia Foundation together to co-define a more resilient, future-proof technological platform. Council members will evaluate and consult on the movement’s product and technical activities, so that we develop multi-generational projects. We are looking for a range of technical contributors across the globe, from a variety of Wikimedia projects. Please apply here by August 10.
  • Editors with rollback user-rights who use the Wikipedia App for Android can use the new Edit Patrol features. These features include a new feed of Recent Changes, related links such as Undo and Rollback, and the ability to create and save a personal library of user talk messages to use while patrolling. If your wiki wants to make these features available to users who do not have rollback rights but have reached a certain edit threshold, you can contact the team. You can read more about this project on Diff blog.
  • Editors who have access to The Wikipedia Library can once again use non-open access content in SpringerLinks, after the Foundation contacted them to restore access. You can read more about this and 21 other community-submitted tasks that were completed last week.

Changes later this week

Future changes

  • Advanced item Next week, functionaries, volunteers maintaining tools, and software development teams are invited to test the temporary accounts feature on testwiki. Temporary accounts is a feature that will help improve privacy on the wikis. No further temporary account deployments are scheduled yet. Please share your opinions and questions on the project talk page. [1]
  • Editors who upload files cross-wiki, or teach other people how to do so, may wish to join a Wikimedia Commons discussion. The Commons community is discussing limiting who can upload files through the cross-wiki upload/Upload dialog feature to users auto-confirmed on Wikimedia Commons. This is due to the large amount of copyright violations uploaded this way. There is a short summary at Commons:Cross-wiki upload and discussion at Commons:Village Pump.

Tech news prepared by Tech News writers and posted by bot • Contribute • Translate • Get help • Give feedback • Subscribe or unsubscribe. You can also get other news from the Wikimedia Foundation Bulletin.

I am a big fan of writing 550-word blog posts, but fewer and fewer people these days are fans of reading them. (Short) video is increasingly dominant.

In 2024, Wikimedia Ukraine created a series of eight video tutorials on Wikipedia and other Wikimedia projects. Each 3-7 minute video resulted from a collaborative effort involving scriptwriting, professional recording, and expert editing.

The tutorials, available in Ukrainian, cover topics such as an introduction to Wikidata, Wikipedia’s notability guidelines, and whether you can trust Wikipedia. We published them on Wikimedia Commons and YouTube.

As we plan to develop more videos, here are three key lessons and reflections on our process so far.

Nesterenko Olya, Ukrainian Wikipedia administrator, during a video tutorial recording in January (photo: Viktor Chornomyz, CC BY-SA 4.0)

1. Preparing a short video tutorial is difficult but rewarding

We’ve been organizing webinars for the past four years, with recordings freely available afterwards. But let’s be realistic – few people will have the time and attention span to watch a 60-minute video on a topic they want to learn.

Summarizing a complex topic like Wikidata into a concise 5-minute video requires careful phrasing and focus. How do you include all important aspects, while not overloading viewers with too much detail? 

Interestingly, an AI chatbot like ChatGPT is helpful in writing an outline and breaking down a big topic into a series of smaller steps. 

The outline ChatGPT suggested for the video on categories was helpful to structure the script (screenshot by Anton Protsiuk)

We hope that the tutorials will be helpful for many viewers. But developing them has also been a useful experience for the creators. Thanks to this exercise, we’ve become better at helping and mentoring newcomers – and learned a lot ourselves. The best way to learn something is to teach it to someone else.

2. Other communities have already done a lot of work that can inspire, if not be reused

Video tutorials are a popular genre in the Wikimedia movement. The Commons category includes hundreds of files from the Wikimedia Foundation, Wikimedia Estonia, Wikimedia Community User Group Turkey, and many more organizations and individual volunteers.

It’s hardly possible to reuse them “as is” because of language and context differences. (And, frankly, many of them are outdated by now). But there are a lot of interesting video projects that have been a source of inspiration – for example, “A Wiki Minute” series from the Wikimedia Foundation.

3. This work is a long-term investment, not a chase for views

Some video tutorials we’ve developed have garnered a lot of views – for example, the video on creating a Wikipedia article published a few years ago has received close to 10,000 views on YouTube. 

That said, we’ve decided not to prioritize chasing views by tools like clickbait titles or paid social media promotion. Tutorials are a long-term investment that will serve people who look up a certain topic on Wikipedia or Google. It’s also a good tool for event organizers who will be able to rely on instructional videos during training sessions rather than developing material from scratch. 

Organizers of a local Wikimarathon event in February 2024 are demonstrating a video tutorial recorded by Wikimedia Ukraine (photo: Еколог Світлана, CC0)

Plus, when I get a question about Wikipedia from someone, it’s easier to send a link to a video rather than rewrite an explanation for the hundredth time 🙂 

More information & read also: 

Have you ever wished for a new tool or feature that would make editing easier on the wikis? In order for Wikimedia’s projects to grow and thrive, volunteers need software that helps them write, cite, upload, and collaborate easily and across languages. Product teams at the Wikimedia Foundation build and maintain software throughout the year, and seek collaboration with volunteers to build, test, and deliver improvements. To support this collaboration, the Foundation has now re-opened one of our primary channels for technical suggestions and ideas: the Community Wishlist.

The Community Wishlist is Wikimedia’s forum for volunteers to share ideas (called wishes) to improve how the wikis work. Unlike the previous iteration, the wishlist is always open, and volunteers can submit wishes in any language. Wishes should reflect product or technical challenges that volunteers encounter, and give space for developers, designers, and engineers to partner with users in building solutions together. Once a wish comes in, Wikimedia Foundation teams, volunteer developers, and Wikimedia affiliates will review open wishes and select areas to build on.

Volunteers can engage with the Community Wishlist on three levels: submit, vote, and build. 

  1. Submit a Wish: The new Community Wishlist enables volunteers to share their wishes in both wikitext and the Visual Editor, and using their preferred language. Volunteers may also submit as many wishes as they’d like, at any time, but they must be logged into MetaWiki. Here’s how to get started:
  • Navigate to the Community Wishlist home page and click “Submit Wish,” then complete the following required fields:
    • Name: a name for your wish.
    • Description: the problem you want to solve.
    • Type: a feature request, bug report, system change, or something else.
    • Project: the wiki projects associated with the wish.
    • Affected users: a description of users who’d benefit from the wish being solved.
    • Users may optionally share a Phabricator ticket.
  • Press Submit. That’s it!
  2. Vote on Focus Areas: Wikimedia Foundation teams will review a new wish for relevance and completion and, if applicable, group it into a Focus Area, which represents a cluster of wishes with the same underlying problem. Focus Areas will be initially created by the Foundation, and volunteers may vote and comment on Focus Areas to signal opportunities in need of prioritization.

Wikimedia Foundation teams will then adopt Focus Areas as part of the Annual Plan. This is an important change from the previous wishlist: while multiple Foundation teams worked to fulfill community wishes (see: Dark Mode, Edit Check), this wasn’t done consistently and at scale. Moving forward, the wishlist will serve as a central pipeline for surfacing community technical needs into Product & Technology annual planning, which is how major resourcing decisions are made. Starting in 2024-25, the Foundation has committed at least two teams (Community Tech and Moderator Tooling) to examine the Wishlist and adopt and address Focus Areas. As we evaluate community feedback on Focus Areas, additional teams may also adopt Focus Areas for this year. By 2025-2026, we anticipate even more wishes filtering into annual planning, with the acknowledgement that not every wish will be incorporated into a Focus Area, and not every wish or Focus Area will be worked on by the Foundation.

We will be introducing our first set of Focus Areas at Wikimania, and invite volunteers to collaborate with us on future Focus Areas. 

  3. Build technical solutions. The Foundation, affiliates, and technical stakeholders may adopt Focus Areas and continue to collaborate with contributors to further develop and build a new idea.

The Community Wishlist is important to building a multi-generational movement; in order to build sustainable software, we need to hear from, and collaborate with, volunteers about challenges and opportunities to improve our product and technology. Anyone in the movement can engage with wishes, and collaborate with fellow volunteers and the Wikimedia Foundation to build better software.

We’re so excited to reopen Community Wishlist with you, and can’t wait to see what problems and opportunities you have to share with each other.

Celebrating Ellie

Monday, 15 July 2024 13:00 UTC

The Wikimedia movement is built on collaboration, so it is a natural place for people who find joy in working with others, supporting them and learning through supporting their peers. Today we Wikicelebrate one such person: Ellie from El Salvador, a passionate Wikipedia trainer, event organizer and someone who finds motivation and joy in seeing others grow. 

Like many now-contributors, Ellie discovered the Wikimedia movement by participating in an outreach activity. It was 2019, and Ellie took part in an Art and Feminism workshop held in the Centro Cultural de España en El Salvador. This event, dedicated to enriching Spanish-language Wikipedia content related to notable women from El Salvador, taught Ellie how to take her first steps in the Wikimedia world: create her own user page, look for sources, discuss with other editors, and (most importantly) add content to Wikipedia. For her, a professional journalist and communicator, this newly discovered wikiworld, focused on sharing information and storytelling, was fascinating from the very first moment.
She started making her first edits, and thanks to Ellie, Spanish Wikipedia gained content about female athletes, like swimmer Aurora Chamorro and long-distance runner Vanessa Veiga. She also engaged in Wikimedia Commons, where you can find her uploads of exhibits from the Museo de Artes y Tradiciones Populares de El Salvador, and in Wikidata.

Soon after becoming an editor, Ellie took a deeper dive into the Wikimedia community. In 2021, Karla and Cristina, two wikimedians who had been enriching Spanish Wikipedia articles about El Salvador, invited her to help form a new Wikimedia organization. This organization would not only help them promote Wikimedia projects in their country, but would also support their work of growing content about El Salvador in the online encyclopedia. There is still much to tell the world about this Central American country with its incredibly rich culture, fascinating history, and tasty cuisine (just take a look at the article about pupusa!) – as wikimedians from El Salvador point out themselves, there is still a significant gap in biographical articles about Salvadorans on Wikipedia. Ellie was encouraged to participate, and thanks to that teamwork, in May 2021 (when Spanish Wikipedia celebrated its 20th birthday), Wikimedistas El Salvador was born. Since then the team has given workshops about Wikipedia in cultural centers, museums, and institutes of social studies, and has trained librarians, scientists, university students, and the general public on how the platform works. They have collaborated with the Olympic Committee of El Salvador, the Latin American Council of Social Sciences, and the Cultural Center of Spain in El Salvador.

  

“Maybe I see it as very long term, but the digital gap in El Salvador is still a challenge. I would like Wikimedia projects to be a space with academic information and scientific dissemination, a tool to enhance knowledge about El Salvador and the Latin American region,” she says when asked what keeps her going. Ellie also takes great joy in collaborative work and supporting others: “I am happy editing but I am also happy teaching others how to edit, to know the platforms and the projects with the purpose of enriching the free knowledge. The most beautiful moments are when I finally publish another article, and when some of the people I have trained at a workshop publish their first article and are encouraged to teach others about the project,” she comments. 

Unfortunately, not everything is a bed of roses when you’re a newcomer in the movement, and Ellie shares that there were challenges with her editing efforts: “Like when an article is deleted without warning because some important character or topic in my country is not considered relevant; this is because people do not know the context and history of countries in the region,” she explains. “But it is important to be positive and take those difficulties as an incentive to continue learning and releasing knowledge,” she adds. 

One thing she would like to share with other Wikimedians is the importance of always remembering your beginnings: “Although we have a long history as editors, let us not forget that we were once newcomers,” she says. “Let’s guide the new editors with civility and respect.”

Apart from that, she names three main global challenges for the Wikimedia movement: how not to lose the collaborative spirit (the “we”), how not to forget the main purpose of the project (not to lose the essence), and how to adapt to the new generations and technologies that are building their own definition of knowledge (to maintain openness).

Ellie would like the world to know more about Wikipedia, especially that (in her own words) “Wikipedia is a project with digital tools that contribute to human development, universal access to information, knowledge and education. Wikipedia is a space where you can grow and learn in a community, because when you work on the same goal (free knowledge) language barriers and borders do not exist, we all, volunteers, are Wikimedia”.

Thank you for being part of this amazing community of volunteers, Ellie. Today we celebrate your commitment to the Wikimedia movement. 

Tech News issue #29, 2024 (July 15, 2024)

Monday, 15 July 2024 00:00 UTC
2024, week 29 (Monday 15 July 2024)

Tech News: 2024-29

weeklyOSM 729

Sunday, 14 July 2024 09:59 UTC

04/07/2024-10/07/2024

lead picture

Gallery of Overpass Ultra map examples [1] | © dschep, uMap | map data © OpenStreetMap contributors

Mapping campaigns

  • The humanitarian collaborative mapping campaign in response to the 2024 Rio Grande do Sul Floods (Brazil) is ongoing. The effects of the disaster that led to landslides, floods, and a dam collapse persist and 5,000 people are still homeless in the state. Everyone can collaborate in the open projects.

OpenStreetMap Foundation

  • OpenStreetMap experienced a DDoS attack on Thursday 11 July, causing significant access issues and intermittent service disruptions, which the technical team is actively working to resolve.

Events

  • The State of the Map Working Group is happy to announce that ticketing and programme websites for SotM 2024 are now accessible. Early bird tickets are available at a discounted price until Wednesday 31 July.
  • Did you miss the call for general and academic presentations for the State of the Map 2024? You can still showcase your project or map visualisation by submitting a poster before Sunday 25 August. For inspiration take a look at the posters from SotM 2022.
  • The SotM France 2024 videos are now available on PeerTube.
  • The State of the Map US 2024 highlighted some new developments in pedestrian mapping, the integration of AI into mapping processes, and climate and historical data projects, with presentations on accessibility mapping, OpenStreetMap data validation, and participatory GIS for public land management.

Education

  • IVIDES.org carried out a hybrid workshop on collaborative mapping with OpenStreetMap and web mapping with uMap for a group of geography students from the Federal University of Ceará (Brazil), Pici campus (Fortaleza), and the general public. Raquel Dezidério Souto wrote about the experience in her diary, and the files and video are available in Portuguese.

OSM research

  • Lasith Niroshan and James D. Carswell introduced DeepMapper, an end-to-end machine learning solution that automates updates to OpenStreetMap using satellite imagery.

Maps

  • [1] TrailStash, ‘the home for #mapping projects by @dschep’, tooted that they have created a gallery of Overpass Ultra map examples.

OSM in action

  • Bristow_69 noted that the Dialogues en Humanités festival is using a nice OpenStreetMap-based map, but unfortunately has not given proper credit to OpenStreetMap.
  • EMODnet’s (European Marine Observation and Data Network) map viewer includes base and feature layers from OpenStreetMap.
  • NYC Street Map represents an ongoing effort to digitise official street records, bring them together with other street information, and make them easily accessible to the public. The app was developed with OpenMapTiles and OSM contributors’ data. Users can find the official mapped width, name, and status of specific streets and how they may relate to specific properties. It is also possible to see how the street grid has changed over time in a chosen area.
  • Ola Cabs have replaced Google with OSM in their Ola Maps navigation application. The change aimed to reduce costs and provide faster, more accurate searches and improved routing. This transition is part of Ola’s broader strategy to improve the user experience and gain independence in navigation technology; Ola Maps was first introduced in its electric vehicles with MoveOS 4 earlier this year.
  • UtagawaVTT maintains the web platform Opentraveller, where contributors can register their mountain bike and electric bike travel routes and consult online data.

Software

  • HOT has released the production version of fAIr, an assistant for mapping with AI, to a wider audience of OSM communities. The software has been tested and the production website is now accessible (login with your OSM account).
  • Adam Gąsowski has introduced his OSM Helper UserScript, designed to streamline the use of community-built tools by automatically generating relevant links based on what the user is looking at. Future plans include integrating AI for automated tagging and developing a browser extension for Chrome and Firefox.
  • Gramps Web, the open-source, self-hosted family tree application, has added a historical map layer based on OpenHistoricalMap.
  • The 20.1.0.1 beta release of Vespucci included numerous updates, such as the removal of pre-Android 5 code, improvements to error handling and memory management, enhancements to the property editor, and new features such as GeoJSON label support and layer dragging.

Programming

  • MapBliss is an R package for creating beautiful maps of your Leaflet adventures. It allows users to create print-quality souvenir maps, plot flight paths, control label positions, and add custom titles and borders. The package integrates several dependencies and is open for contributions and feature requests.
  • Mattia Pezzotti is documenting his progress in integrating Panoramax with OpenStreetMap as part of Google Summer of Code 2024, providing weekly updates on new features and improvements such as viewing 360-degree images, adding filters, and improving the user interface. This ongoing project was previously covered in weeklyOSM 723.
  • JT Archie described how they optimised large-scale OpenStreetMap data by converting it to a SQLite database, using full-text search and compression techniques, in particular the Zstandard seekable format, to handle data efficiently and improve query performance.

Did you know …

  • … the release of Taiwan TOPO v2024.07.04 continues the tradition of weekly updates started in September 2016? Taiwan TOPO provides detailed topographic data for Taiwan.

OSM in the media

  • In an op-ed in The New York Times, Julia Angwin criticised society’s overreliance on turn-by-turn navigation in Google Maps and called for greater investment in OpenStreetMap as a public good.

Other “geo” things

  • The Ammergauer Alpen natural park has implemented a visitor monitoring system using sensors and GPS data to manage and protect natural areas while supporting sustainable tourism.
  • Geomob has tooted about the release of the episode #241 of their Geomob podcast, which covers a wide variety of issues, such as the distortion of some electoral maps and the use of drones in agriculture.
  • The Olympic torch relay route can be viewed on the Paris 2024 official website. The uMap Trajet Flamme Olympique 2024, created by @IEN52, shows all 67 stages of the route, including overseas territories. Some other uMaps show the passage of the Olympic Torch through selected cities.
  • The Philippines’s Second Congressional Commission on Education and the Department of Education are partnering to conduct a comprehensive nationwide mapping of private schools starting this July. This initiative aims to inform government policies, optimise resource allocation, and enhance complementarity between the public and private education systems.
  • TomTom and East View Geospatial have partnered to provide Australia’s Department of Defence with global map data, leveraging TomTom’s Orbis Maps for accurate geospatial information critical to national security and disaster response. TomTom’s Orbis Maps is made by conflating open data from Overture and OSM with TomTom partners’ data and TomTom’s proprietary data in a controlled environment.
  • Marcus Lundblad has published his annual ‘Summer Maps’ blog post for 2024, with updates to map visualisations, improvements to search functionality and dialogue interfaces, the addition of a playground icon, support for public transport routing, and the introduction of hill shading for showing terrain topology.
  • Researchers at Sun Yat-sen University, in collaboration with international experts, have detailed, in the Journal of Remote Sensing, a framework for building extraction using very high-resolution images in complex urban areas, addressing the limitations of existing datasets for urban planning and management.

Upcoming Events

  • Salt Lake City: OSM Utah Monthly Map Night, 2024-07-11
  • Lorain County: OpenStreetMap Midwest Meetup, 2024-07-11
  • Amsterdam: Maptime Amsterdam: Summertime Meetup, 2024-07-11
  • Berlin: DRK Online Road Mapathon, 2024-07-11
  • Wildau: 193. Berlin-Brandenburg OpenStreetMap Stammtisch, 2024-07-11
  • Zürich: 165. OSM-Stammtisch Zürich, 2024-07-11
  • Bochum: Bochumer OSM-Treffen, 2024-07-11
  • Bangalore East: OSM Bengaluru Mapping Party, 2024-07-13
  • Portsmouth: Introduction to OpenStreetMap at Port City Makerspace, 2024-07-13 – 2024-07-14
  • København: OSMmapperCPH, 2024-07-14
  • Strasbourg: découverte d’OpenStreetMap, 2024-07-15
  • Richmond: MapRVA – Bike Lane Surveying & Mapping Meetup, 2024-07-16
  • England: OSM UK Online Chat, 2024-07-15
  • Missing Maps London: (Online) Mid-Month Mapathon, 2024-07-16
  • Bonn: 177. OSM-Stammtisch Bonn, 2024-07-16
  • Hannover: OSM-Stammtisch Hannover, 2024-07-17
  • Łódź: State of the Map Europe 2024, 2024-07-18 – 2024-07-21
  • Zürich: Missing Maps Zürich Mapathon, 2024-07-18
  • Annecy: OSM Annecy Carto-Party, 2024-07-18
  • OSMF Engineering Working Group meeting, 2024-07-19
  • Cocody: OSM Africa July Mapathon – Map Ivory Coast, 2024-07-20
  • München: Mapathon @ TU Munich, 2024-07-22
  • Stadtgebiet Bremen: Bremer Mappertreffen, 2024-07-22
  • San Jose: South Bay Map Night, 2024-07-24
  • Berlin: OSM-Verkehrswende #61, 2024-07-23
  • [Online] OpenStreetMap Foundation board of Directors – public videomeeting, 2024-07-25
  • Lübeck: 144. OSM-Stammtisch Lübeck und Umgebung, 2024-07-25
  • Wien: 72. Wiener OSM-Stammtisch, 2024-07-25

Note:
If you would like to see your event here, please add it to the OSM calendar. Only data that is there will appear in weeklyOSM.

This weeklyOSM was produced by Aphaia_JP, MatthiasMatthias, PierZen, Raquel Dezidério Souto, Strubbl, TheSwavu, YoViajo, barefootstache, derFred, mcliquid, miurahr, rtnf.
We welcome link suggestions for the next issue via this form and look forward to your contributions.

Teaching AI in Schools

Saturday, 13 July 2024 03:30 UTC

Artificial Intelligence (AI) is a hot topic these days, and it’s natural to wonder how it fits into education. In this article, we will explore the best practices, concerns, and recommendations for integrating AI into school curriculums. I will also provide references to useful tools and learning materials.

Importance of AI education at schools

Why is there a growing interest in teaching AI in schools? AI has become deeply integrated into society, creating new applications and possibilities while also introducing ethical concerns.

Women in yellow and orange garb

Introduction

In alignment with our commitment to communicate our funding outcomes, this article provides an overview of the direct support provided by the Wikimedia Foundation through grantmaking towards the Movement in the fiscal year of 2023/24. Our focus will be on the Wikimedia Community Fund program, which includes the regional budgets for General Support Funds (unrestricted and multi-year funding) and Rapid Fund programs, as well as the Conference and Event Fund.

This link showcases the list of all funded grantees across all regions with the General Support Fund and Conference Funds.

We will highlight trends observed across all of our funding regions and share lessons learned from experimenting with various funding options offered to the communities. The regions discussed are East, Southeast Asia and Pacific [ESEAP], South Asia [SA], Northern and Western European Region [NWE], North America [NA], Middle East and North Africa [MENA], Sub-Saharan Africa [SSA], and Latin America and the Caribbean [LAC].

Main Takeaways – Trends

Alignment with the Movement Strategy

The Regional Funds Committee members and the Thematic Experts evaluating the General Support Fund proposals found that 92% of all applications globally were in line with the 2030 Movement Strategy, and only 8% required clarifications on their contribution to the strategy. 

Newcomers Funded in the GSF [General Support Fund]

In this fiscal year, Regional Funding Committees approved a total of 20 first-time applications to the General Support Fund; 4 from the East, Southeast Asia and Pacific Regions, 4 from South Asia, 1 from the Northern and Western European Region, 2 from North America, 5 from the Middle East and North Africa, 4 from Sub-Saharan Africa. The rest of the grants were approved as returning annual and multi-year grants.

Trends in the GSF [General Support Fund] 

  • Increased Demand for Multi-Year Grant Funding: There’s increased uptake of multi-year grant funding in multiple regions: NWE, ESEAP, NA, MENA, LAC, CEE and slightly in SSA. Regional Funds Committees conducted a careful evaluation of each grantee’s strategic strength, organizational stability, reporting strength and previous impact, as well as budget restrictions before awarding multi-annual proposals in this fiscal year. 
  • Programming: In most regions (SA, ESEAP, NWE, LAC, CEECA, NA) proposals showed more intentional and consistent programming, focusing on consolidating ongoing activities and deepening partnerships rather than innovating. An observed trend was a lack of new or innovative strategies. For some grantees, this was due to the practicality of aligning new strategies with multi-year grant applications, or a commitment to continuing successful programs from the previous year, as these efforts have proven to be reliable.
  • Compared to last year, there was an increase in cases where Regional Funds Committees provided conditional funding to support strategic reviews of grantee partner programming and organizational or group structures, to enable the proposed work.
  • Professionalization Efforts: In some regions (esp. NWE, ESEAP, SA, SSA) the grantee community showed increased effort in professionalization, including improved practices for financial management, robust and innovative governance structures, embarking on or improving fundraising initiatives and efforts, as well as working on becoming an autonomous registered entity in their country. 
  • Addressing Volunteer Burnout: There is a global trend of addressing volunteer burnout by bringing on board paid staff to hold roles typically held by volunteers. 
  • Thematic Focus: Culture and heritage remain the main thematic focus in NWE and ESEAP, while education and climate change were leading in MENA and LAC. Grantees in these last regions are expanding to education, technology, and indigenous communities. Youth engagement remains a priority in the ESEAP region and is becoming a central focus in the CEE region through robust educational initiatives.

    Additionally, some regions are targeting niche audiences. In South Asia, there is a focus on documenting local fauna and indigenous languages and developing technical communities. Meanwhile, in North America, newly funded organizations are concentrating on language revitalization and providing access to high-quality medical information for practitioners and patients.
  • Diversity and Inclusion: This fiscal year was a win for Diversity and Inclusion: across all regions, applications showed a deep reflection on equity and inclusion. All affiliates have identified content and representation gaps and are actively working to bridge them. We see a continued focus among grantees on bringing women on board as part of their efforts to diversify the participant and governance demographic, as well as on bringing in rural and remote communities and minority groups (e.g. indigenous, neurodivergent, or gender-diverse groups). 
  • Focus on communities beyond domestic borders: Many grantees in the NA and NWE regions extend their support to communities outside their regions. Examples include Whose Knowledge’s outreach work with Dalit, queer, and indigenous communities in Africa, Latin America, South Asia, and Southeast Asia. WM Sverige (Sweden) provides global support for GLAM initiatives, while WM France and Les sans PagEs focus on supporting francophone communities worldwide.
  • Strengthened Regional Funding Committees: This financial year, Regional Funds Committees exercised greater agency in their decision-making, reflecting on their experiences and clearly articulating their support needs. In collaboration with the Wikimedia Foundation’s DEI team, the Committee members enhanced their skills in diversity, inclusion and equity, and improved their use of language around neurodiversity. Additionally, opportunities were identified for their inclusion in the regional grants budgeting process, further empowering their role.

Identified areas of improvement 

  • Enhancing the visibility of the work being done by grantee partners to multiple stakeholders for different purposes, utilizing existing tools and platforms such as the outreach dashboard, Diff, Meta, and social media.
  • Fostering a Sense of Collaboration: Committees recommended grantees pursue opportunities for collaborative work amongst each other and with external partners. This includes working together on programs, campaigns, projects, communication efforts, and institutional learning.
  • Focus on Sustainability: Financial and organizational/programming sustainability were key areas of focus in all regions. Committees often questioned the sustainability of certain approaches proposed by applicants, leading to recommendations on deepening existing efforts rather than scaling up. This included diversifying activities, developing necessary capacities (including on-wiki skills), leveraging volunteers to support organizational efforts, and utilizing leadership development tools.
  • Capacity Building in Fundraising: A clearly defined area for capacity building was supporting external fundraising for affiliates, including assessing grantee skills, building capacity, and defining the appropriate level of WMF intervention in this area. The Wikimedia Foundation is currently working to engage a contractor to support affiliates directly in building this capacity. 


Trends in Rapid Funds

The SSA region continued to have the highest uptake of rapid funds, while the ESEAP, LAC, and SA regions saw a steady increase. In contrast, NA and MENA showed less interest, partly due to government restrictions in Egypt and Sudan and a lack of awareness about funding cycles. In NWE, applications were predominantly from high-income countries, prompting the Committee to prioritize applications from lower-income countries in the future. 

Trends in Conference Fund

About 36% of the grantee partners this year were first-time applicants to the Conference Fund program. We have funded 5 regional events, 5 thematic events, and 6 ‘Growth’ events. Many of them were new initiatives that had never been organized before, such as the WikiOutdoor Training; Climate Justice, Indigenous Voices and Open Knowledge; and the Global Wiki Advocacy Meet-up.

Building a better support system for community event organizers continues to be a priority. Therefore, one of the most impactful changes piloted, which will be fully implemented starting this fiscal year, is that the Travel and Convening team will now handle communication and contracting for all travel and accommodation arrangements for all regional events and some of the thematic events as well. We expect this not only to maximize the budget allocated to community events but also to take much of the organizing burden off the community organizing teams so they can focus more on core wiki work and programming.

Within this link, you will find a detailed list of every funded grantee partner, including the level of funding requested and whether they are newcomers or returning grantees.

We trust that this review of our funding outcomes and lessons learned has been insightful. As we move into FY24/25, we remain committed to continuing to support the movement, informed by the insights we’ve gathered over the years. Next week, we will share another update on the FY24/25 budget, outlining the process used and lessons learned, particularly from involving regional funding committees and the executive director’s working group.

For any questions or feedback, don’t hesitate to get in touch with us at communityresources@wikimedia.org.

Coordinate Me: free data in competition

Friday, 12 July 2024 07:00 UTC

3,228 registered participants and 129,102 edited Wikidata items: that’s the result of Coordinate Me 2024, the largest international Wikidata contest to date. The competition took place in May 2024 and was organized by Wikimedia Austria. What did we learn?

What was it about?

The goal of the competition was to improve Wikidata items that contained geodata and were assigned to one of the 16 focus countries. People from all over the world were invited to take part. When selecting the focus countries, we aimed for diversity: both geographically and in terms of the size of the local Wikimedia communities. In Africa, we opted for Niger, Nigeria and Senegal, and in South America for Argentina, Chile and Uruguay. In Europe, we chose Austria, Germany, France, Malta, the Netherlands and Spain. Australia, New Zealand, Canada and India were also included.

To automatically count the masses of contributions, we used the Programs & Events Dashboards by Wiki Education in combination with PetScan and SPARQL. (We had already talked about the technical specifications in a Volunteer Supporters Network training in 2022). Despite dedicated efforts on the part of Wiki Education, the Dashboards repeatedly failed completely. As a result, we were only able to carry out the final evaluation two and a half weeks later than planned.
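For readers curious what such a query-based worklist can look like in practice, here is a minimal Python sketch that asks the public Wikidata Query Service how many items in one focus country already carry coordinates. It only illustrates the general approach; it is not the contest's actual PetScan or SPARQL setup, and the choice of Austria (Q40) is just an example.

```python
# Minimal sketch: count Wikidata items with coordinates for one focus country.
# This illustrates the general approach, not the contest's actual queries.
import requests

WDQS_ENDPOINT = "https://query.wikidata.org/sparql"

# P17 = country, P625 = coordinate location, Q40 = Austria (one of the focus countries)
QUERY = """
SELECT (COUNT(?item) AS ?count) WHERE {
  ?item wdt:P17 wd:Q40 ;
        wdt:P625 ?coordinates .
}
"""

def count_items_with_coordinates(query: str = QUERY) -> int:
    """Run a SPARQL query against the Wikidata Query Service and return the count."""
    response = requests.get(
        WDQS_ENDPOINT,
        params={"query": query, "format": "json"},
        headers={"User-Agent": "CoordinateMe-example/0.1 (illustrative script)"},
        timeout=60,
    )
    response.raise_for_status()
    bindings = response.json()["results"]["bindings"]
    return int(bindings[0]["count"]["value"])

if __name__ == "__main__":
    print("Items with coordinates in Austria:", count_items_with_coordinates())
```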

Wikidata workshop supported by Wikimedia Österreich in Vienna 2018

Local anchoring

As part of a global movement, we were able to benefit from local knowledge and networks. In addition to Wikimedia Austria, eleven Wikimedia partner organizations were involved. Wikimedia Deutschland provided translations of the competition pages into German and a grant of one ninth of the prize money, the rest of which was covered by Wikimedia Austria. Wikimédia France, Wikimedia España and Wikimedia Nederland were responsible for translating the competition pages into their main national languages. Wikimedia Argentina, Wikimedia Canada, Wikimedia Chile, Wikimédiens du Niger User Group, Wikimedia User Group Nigeria, Wikimedia Community User Group Sénégal and Wikimedistas de Uruguay were also involved, particularly in the promotion in their local communities.

For all focus countries, we provided lists with suggestions on what the competition participants could work on. In addition to better links to external databases and the promotion of multilingualism, the geolocalization of objects was naturally a priority. Wikimedia Argentina used this opportunity to suggest improving recently uploaded datasets on more than 500 sculptures in the city of Resistencia. 

Of the 129,102 edited Wikidata items, 15,261 were new items. Most items were edited for India, Australia and Germany – but even Malta, the country with the fewest edits, had more than 100 improved items. So there was interest in all focus countries.

Who was reached?

The decisive factor in promoting the competition was a central notice in the Wikimedia projects that could be seen in the focus countries. The target pages linked in the central notice recorded a total of four million page views. However, organizing the banner placement proved to be particularly nerve-wracking this time – not only for us, but also for some volunteers who we had to ask for help late into the night. We need a reliable and fair process for our central notice system, probably our most important outreach tool.

Of the 3,228 registered participants, 1,449 (45%) had new user accounts. The proportion of new participants is therefore higher than in some photo competitions, which we know are also of interest to newcomers. We were prepared for this with appropriate additional support.

Wikidata as event

The competition was accompanied by a series of online and hybrid events. There were free workshops for Wikidata newcomers in German, French and Spanish. We organized these with the support of the international Volunteer Supporters Network. The three workshops were attended by a total of 37 interested people. 

The Wikimedistas de Uruguay also offered a workshop in Spanish for the OpenRefine tool, which can be used to edit Wikidata on a massive scale. At a Datathon, a group spent an afternoon editing data sets on the Upper Austrian municipality of Molln. This also supported a photo project for Wikimedia Commons, which will take place in Molln this summer.

The most diligent at the end

In the end, prizes with a total value of 4,500 euros were awarded to the 30 participants with the most edited Wikidata items. These are (in alphabetical order): 99of9, Abike25, Akintundedaniel, Arjunaraoc, Brookschofield, Canley, Cookroach, Geiserich77, Gnoeee, GPSLeo, Gwanki, Isiwal, Jessephu, JFVoll, Kalepom, Lodewicus de Honsvels, Madamebiblio, Michael w, Prosperosity, Pymouss, Rudermeister, Saiphani02, Salil Kumar Mukherjee, Sriveenkat, Uniwah, Vanbasten 23, Werthercito, Z thomas, আফতাবুজ্জামান and ᱤᱧ ᱢᱟᱛᱟᱞ.

Congratulations, and have a good rest until our next Wikidata competition!

Wikidata logo in heart shape

A number of tools hosted on Toolforge rely on the replicated MediaWiki databases, dubbed "Wiki Replicas".

Every so often these servers have replication lag, which affects the data returned as well as the performance of the queries. And when this happens, users get confused and start reporting bugs that aren't solvable.

This actually used to be way worse during the Toolserver era (sometimes replag would be on the scale of months!), and users were well educated about the potential problems. Most tools would display a banner if there was lag, and there were even bots that would update an on-wiki template every hour.

A lot of these practices have been lost since the move to Toolforge, because replag has been basically zero the whole time. Now that more database maintenance is happening (yay), replag is happening slightly more often.

So to make it easier for tool authors to display replag status to users with a minimal amount of effort, I've developed a new tool: replag-embed.toolforge.org

It provides an iframe that automatically displays a small banner if there's more than 30 seconds of lag and nothing otherwise.
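As a rough, hypothetical illustration of that behaviour, the logic boils down to rendering a banner only when the lag crosses the 30-second threshold. The Python sketch below is not the actual replag-embed code (the tool itself is written in Rust), and the lag lookup is a placeholder.

```python
# Hypothetical sketch of the "banner only above 30 seconds of lag" behaviour.
# Not the actual replag-embed implementation (that tool is written in Rust);
# get_replag_seconds() is a placeholder for however the lag is actually measured.
from datetime import timedelta

LAG_THRESHOLD_SECONDS = 30

def get_replag_seconds(section: str) -> float:
    """Placeholder: return the current replication lag for a DB section, e.g. 's4'."""
    raise NotImplementedError("query the Wiki Replicas / replag data source here")

def render_banner(section: str) -> str:
    """Return a small HTML banner if lag exceeds the threshold, or an empty string."""
    lag = get_replag_seconds(section)
    if lag <= LAG_THRESHOLD_SECONDS:
        return ""  # nothing to show: the embedded iframe stays empty
    pretty = str(timedelta(seconds=int(lag)))
    return (
        f"<p>The replica database ({section}) is currently lagged by "
        f"{lag:.4f} seconds ({pretty}), you may see outdated results or slowness.</p>"
    )
```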

As an example, as I write this, the current replag for commons.wikimedia.org looks like:

The replica database (s4) is currently lagged by 1762.9987 seconds (00:29:22), you may see outdated results or slowness. See the replag tool for more details.

Of course, you can use CSS to style it differently if you'd like.

I've integrated this into my Wiki streaks tool, where the banner appears/disappears depending on what wiki you select and whether it's lagged. The actual code required to do this was pretty simple.

replag-embed is written in Rust, of course (source code), and leverages in-memory caching to quickly serve responses.

Currently I'd consider this tool to be beta quality - I think it is promising and ready for other people to give it a try, but know there are probably some kinks that need to be worked out.

The Phabricator task tracking this work is T321640; comments there would be appreciated if you try it out.

I read in the newspaper that Minister Muhammed Riyas told the legislative assembly that AI kiosks will be set up to help tourists, so that language is not a barrier. According to the minister, kiosks powered by artificial intelligence will answer visitors in their own language. (“AI kiosks to be set up to help tourists overcome the language barrier” – Deshabhimani newspaper, July 12, 2024.)

Some questions: How do tourists currently learn about a tourist destination and get their questions answered? What are the shortcomings of that? What facility will these kiosks offer that is not already available on mobile phones with an internet connection? Is there any information that is not available on the internet but can be obtained only from these kiosks?

This Month in GLAM: June 2024

Friday, 12 July 2024 02:31 UTC

Wikipedia Town Inazawa

Thursday, 11 July 2024 19:20 UTC

On July 7, 2024, I (User:Asturio Cantabrio) participated in “Wikipedia Town Inazawa”, held at the Inazawa City Central Library in Inazawa City, Aichi Prefecture, Japan.

Organizers and Participants

The event was organized by the Inazawa City Hall Commerce and Tourism Division. The lecturer was Professor Kazuto Aoki from the Fukui Prefectural University Regional Economic Research Institute.

There were 16 participants; in addition to residents of Inazawa City, there were also participants from Ama City and Nagoya City. The participants included Inazawa City Hall staff, Aichi Prefectural Government staff, Tourism Association staff, public library librarians, university librarians, and school librarians.

The destinations on the walk were the Chuko Memorial Hall, the Owari Kokuga Site, Akazome Emon Poetry Monument Park, and the Owari School Site. The Chuko Memorial Hall is a modern Western-style building completed in 1880, while the other three are historical sites related to Akazome Emon, a poet of the Heian period, and her husband, Oe no Masahira.

Chuko Memorial Hall

The Chuko Memorial Hall is the oldest modern Western-style building in Inazawa City; “Chuko” refers to Nakajima County Higher Elementary School.
The interior of the Chuko Memorial Hall is open to the public about once a year, so even modern architecture fans rarely have the opportunity to enter the building.


The Chuko Memorial Hall was completed in 1880, but it is unclear what purpose it served in the seven years before it became Nakajima County Higher Elementary School in 1887.

In addition, this building was first built near Zengen-ji Temple, and then moved three times, in 1912, 1940, and 1960, before settling in its current location. Over the course of 144 years, it has been used for various purposes, including a school, town hall, agricultural association, and board of education.


There is no page on the web that summarizes this complex history. Also, the explanations in the paper versions of “Repair Work Report for Chuko Memorial Hall” and “History of Inazawa City” are somewhat difficult to understand. In that sense, I think that creating an article on Chuko Memorial Hall on Wikipedia at this event was very meaningful.

Editing Workshop

After discussing the topic in advance, the organizers and instructors decided to split into three groups to edit Wikipedia articles: “Chuko Memorial Hall” (new article), “Akazome Emon Poetry Monument Park” (new article), and “Owari Province” (additions to existing text). I acted as the facilitator for the Chuko Memorial Hall group.

However, the three Wikipedians facilitating the groups agreed that “creating a new article for Akazome Emon Poetry Monument Park would be difficult from the perspective of the guidelines for creating independent articles.” Therefore, the “Akazome Emon Poetry Monument Park” group added content to “Akazome Emon” rather than creating a new article for the park, and the “Owari Province” group focused on adding content to “Oe no Masahira” rather than to Owari Province.

Literature used in editing

The Inazawa City Central Library, which was also the venue, prepared the literature used for editing. However, I assumed that the library would only prepare rather formal books, so I personally printed out newspaper articles related to the topic from the Chunichi Shimbun and Asahi Shimbun databases and brought them to the event.

Such newspaper articles provide a concise and easy-to-understand summary of the subject. They are also useful for understanding the influence of the subject in modern times. For example, in the books, the Chuko Memorial Hall is only described as a building that was once used as an elementary school, but when we read the newspaper articles, we learn that it is a building that is still used for exhibitions and other events.

It seems that some groups viewed documents in the National Diet Library Digital Collection during the event and added sources. I don’t know if the Inazawa City Central Library will read the edited article, but it was an interesting event in which many documents that the organizers did not prepare in advance were used.

Geneva, Switzerland — Yesterday, the Wikimedia Foundation, the nonprofit that hosts and supports Wikipedia and other Wikimedia projects, was again denied accreditation as a permanent observer to the World Intellectual Property Organization (WIPO) — the specialized United Nations (UN) agency that determines global policies on copyright, patents, and trademarks for its 193 Member States. 

Observer status would enable the Wikimedia Foundation to participate and contribute to WIPO committees where intellectual property norms are set. For the fourth time, China opposed the Foundation’s request for observer status, based, once again, on false accusations that the Foundation is complicit in spreading disinformation. China misrepresented Wikipedia’s volunteer-driven policies and practices, all of which are rooted in accuracy and neutrality and help effectively counter misinformation and disinformation online.

As the host of the world’s largest online encyclopedia, the Wikimedia Foundation has a material interest and deep, practical expertise in many of the issues being discussed at WIPO, including traditional knowledge, copyright, access to knowledge during times of crisis, and Artificial Intelligence (AI). The Foundation’s presence at WIPO would help to ensure that the future of copyright truly reflects the global and diverse needs of the internet. Given that the content on Wikipedia and other Wikimedia projects also plays an essential role in training almost every large language model (LLM), the Foundation can offer valuable recommendations and unique insights as WIPO strives to understand and respond to the impact of AI on intellectual property rights.

“In the age of AI, Wikipedia is at the forefront of global copyright debates. Our experience at the Wikimedia Foundation can help WIPO Member States achieve meaningful policy transformations to protect open knowledge and content creation for the public interest,” said Stephen LaPorte, General Counsel of the Wikimedia Foundation. “We regret that the Foundation has once again been denied the opportunity to participate as observers at WIPO, especially on the basis of erroneous statements. We call on WIPO leadership to find a solution that can resolve this deadlock. Until then, we will continue to seek opportunities to represent open knowledge and the public interest at WIPO and beyond. Since 2022, our consultative status at the UN Economic and Social Council (ECOSOC) has allowed us to actively contribute to global initiatives like the Global Digital Compact, and we hope to one day share our expertise with WIPO as well.”

For 21 years, the Wikimedia Foundation has continuously contributed to country-level legislative processes on intellectual property, stressing the importance of balanced copyright laws for hosting content on Wikipedia and any other free and open online spaces designed for the public interest. Moreover, in times of crisis, conflicts, and pandemics, Wikimedia projects provide critical and reliable information that must remain available and be protected in forums like WIPO. 

The Foundation applied as a permanent observer to WIPO in 2020, 2021, 2023, and again this year, 2024. Our application was once again denied during WIPO’s General Assembly meeting based on a lack of consensus caused by China’s opposition. China has also previously blocked applications from Wikimedia affiliate groups and chapters seeking permanent or ad hoc observer status in WIPO. The Netherlands, as coordinator of the WIPO group of industrialized countries (which includes Australia, Israel, Japan, New Zealand, Norway, Turkey, the Holy See, and many European Union member states), the United States (US), France, Canada, Switzerland, and the United Kingdom (UK) expressed public support for the Foundation’s application. Supporting countries highlighted the Foundation’s valuable insights and experiences, demonstrating its involvement in global copyright issues and relevance to WIPO’s work. 

The Wikimedia Foundation is an active and respected contributor and shaper of policies and practices concerning access to knowledge and information around the world. We hope that UN Member States and WIPO leadership will act to help advance global access to free knowledge by enabling the Foundation’s observer status application to move forward in the near future.

About the Wikimedia Foundation

The Wikimedia Foundation is the nonprofit organization that operates Wikipedia and other Wikimedia free knowledge projects. Our vision is a world in which every single human can freely share in the sum of all knowledge. We believe that everyone has the potential to contribute something to our shared knowledge and that everyone should be able to access that knowledge freely. We host Wikipedia and the Wikimedia projects; build software experiences for reading, contributing, and sharing Wikimedia content; and support the volunteer communities and partners who make Wikimedia possible. The Wikimedia Foundation is a United States 501(c)(3) tax-exempt organization with offices in San Francisco, California, USA.

For media inquiries, please contact press@wikimedia.org

The post Wikimedia Foundation’s Accreditation to World Intellectual Property Organization Blocked for a Fourth Time by China appeared first on Wikimedia Foundation.

Anne-Christine Hoff is an associate professor of English at Jarvis Christian University.

Back in January of this year, I took a three-week, six-hour introductory course on Wikidata through the nonprofit Wiki Education. Before the course’s start, I knew little to nothing about Wikidata, and I had several preconceived notions about the database and its uses.

My first impression about Wikidata was that AI bots ran the system by sweeping Wikipedia pages and then used that information to create data sets under various pre-defined headings. In my conception, Wikidata’s information updated only when editors on Wikipedia changed or added pages. I thought of Wikidata as a closed system, and I thought the point of the course would be to learn how to run queries, so that we students could figure out how to access the data collected through Wikipedia. 

I remember asking my Wiki Education instructor about the role of AI in Wikidata, and he very pointedly responded that bots cannot program anything on their own. Instead, humans program Wikidata, and through this programming capability, both humans and machines can read and edit the system.

Anne-Christine Hoff
Image courtesy Anne-Christine Hoff, all rights reserved.

Wired writer Tom Simonite provided an example of this phenomenon in his article “Inside the Alexa Friendly World of Wikidata”:

“Some information is piped in automatically from other databases, as when biologists backed by the National Institutes of Health unleashed Wikidata bots to add details of all human and mouse genes and proteins.” 

This same article also discusses a further example, published in a paper by Amazon in 2018, of Wikidata teaching Alexa to recognize the pronunciation of song titles in different languages.

Both of these examples do a good job of illustrating another one of my misconceptions about Wikidata. As mentioned before, I thought the system was centralized and, apart from periodic updates, static. I did not conceive of the difference between data collected through documents (like Wikipedia) and a database with an open and flexible, relational communication system. 

What I discovered was vastly more interesting and complex than what I imagined. Wikidata was not a bot-driven data-collecting system drawn from Wikipedia entries; instead, it is a communication system in which data can be added in multiple languages. An editor in Beijing may enter information in Chinese, and that data will immediately be available in all the languages used by Wikidata. This allows users from all over the world to build a self-structuring repository by adding localized data.

In 2013, Wikidata’s founder, Denny Vrandečić, wrote about the advantages that a database like Wikidata has over documents because “the information is stored centrally from where it can be accessed and reused independently and simultaneously by multiple websites without duplication.” In his article “The Rise of Wikidata,” Vrandečić made clear that Wikidata is not just a database for Wikipedia and other Wikimedia projects. It can also be used “for many different services and applications, from reusing identifiers to facilitate data integration, providing labels for multilingual maps and services, to intelligent agents answering queries and using background knowledge” (Vrandecic, 2013, p. 90). 
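As a small illustration of that multilingual reuse, the public Action API on wikidata.org can return an item's labels in several languages in one request; the Python snippet below simply reads them (Q42, Douglas Adams, is used purely as a convenient example item of my choosing, not one from the article).

```python
# Minimal sketch: read one Wikidata item's labels in several languages via the
# public Action API on wikidata.org. Q42 (Douglas Adams) is just a handy example item.
import requests

API = "https://www.wikidata.org/w/api.php"

def get_labels(item_id: str, languages: list[str]) -> dict[str, str]:
    """Return {language code: label} for the requested item and languages."""
    response = requests.get(
        API,
        params={
            "action": "wbgetentities",
            "ids": item_id,
            "props": "labels",
            "languages": "|".join(languages),
            "format": "json",
        },
        headers={"User-Agent": "wikidata-labels-example/0.1"},
        timeout=30,
    )
    response.raise_for_status()
    labels = response.json()["entities"][item_id]["labels"]
    return {code: entry["value"] for code, entry in labels.items()}

if __name__ == "__main__":
    # A label entered in one language sits next to labels in every other language.
    print(get_labels("Q42", ["en", "zh", "es"]))
```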

This raises the question of how Wikidata intelligently reads the information stored on its platform. My first misconception had to do with my belief that Wikidata was a flat collection of data based on Wikipedia’s entries. What I didn’t understand is that the crux of Wikidata’s intelligence comes from its ability to understand data in a relational way. As noted in “Familiar Wikidata: The Case for Building a Data Source We Can Trust,” Wikidata’s semantic structure is based on rules, also known as the Wikidata ontology. According to this ontology, a person may have a relationship to a “born in” place, but a place cannot have a “born in” relationship to other entities. For example, Marie Curie can be born in Warsaw, but Warsaw cannot be born in Marie Curie.

This knowledge-based structure is the key to understanding how Wikidata’s identifiers connect to one another. In Wikidata’s logical grammar, two entities are connected by a relationship, and together they form a statement known as a “triple.” It is this triple structure that creates the structural metadata that allows for intelligent mapping. A fourth item, a citation, turns each triple into a “quad.” The fourth item is crucial to Wikidata’s ability to further arrange the data relationally, by making clear where the data in the triple originates and then arranging the data hierarchically based on its number of citations.
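To make the triple-and-quad idea concrete, here is a minimal Python sketch that reads one statement from Wikidata's public Special:EntityData endpoint: the item Q7186 (Marie Curie), the property P19 (place of birth), the item it points to, and how many references back the claim. It is only meant to show the data shapes described above, not a complete treatment of the Wikidata data model.

```python
# Minimal sketch: read one "triple" (item, property, value) and its references ("quad")
# from Wikidata's public Special:EntityData endpoint.
# Q7186 is Marie Curie and P19 is "place of birth", as in the example above.
import requests

def get_statements(item_id: str, property_id: str) -> list[dict]:
    url = f"https://www.wikidata.org/wiki/Special:EntityData/{item_id}.json"
    response = requests.get(
        url, headers={"User-Agent": "wikidata-triple-example/0.1"}, timeout=30
    )
    response.raise_for_status()
    entity = response.json()["entities"][item_id]
    return entity["claims"].get(property_id, [])

if __name__ == "__main__":
    for claim in get_statements("Q7186", "P19"):
        value = claim["mainsnak"]["datavalue"]["value"]["id"]  # another item (Warsaw)
        references = claim.get("references", [])
        # item --property--> value is the triple; each reference turns it into a quad
        print(f"Q7186 --P19--> {value} ({len(references)} reference(s))")
```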

Having access to the Wiki Education dashboard, I was able to see the edits of the other students taking the class. One student whom I’ll call Miguel was adding missing information about Uruguayan writers on Biblioteca Nacional de Uruguay’s catalog. As of this writing, he has completed more than 500 edits on this and other subjects, such as the classification of the word “anathema” as a religious concept. Two Dutch archivists were adding material on Dutch puppet theater companies in Amsterdam and Dutch women in politics. An Irish student was updating information on a twelfth century Irish vellum manuscript and an English translation of the Old Irish Táin Bó Cúailnge by Thomas Kinsella. 

What I saw when I perused the subjects of edits was exactly what the article “Much more than a mere technology” mentions, that is, that Wikidata is capable of linking local metadata with a network of global metadata. This capability makes Wikidata an attractive option for libraries wanting to “improve the global reach and access of their unique and prominent collectors and scholars” (Tharani, 2021). 

Multiple sources contend that Wikidata is, in fact, a centralized storage database, and yet the intelligence of Wikidata makes this description ring hollow. It is not a database like the old databases for documents. Its ontological structure allows it to understand the syntax of data and arrange that information relationally into comprehensible language. As in the example of the biologists backed by the National Institutes of Health who programmed Wikidata bots to add details about human and mouse genes and proteins, it can also be programmed for use with external databases. Its linking capabilities make it possible for librarians and archivists from around the world to connect their metadata to a network of global metadata. Its multilingual abilities have a similar decentralizing effect, allowing users to create structured knowledge about their own cultures, histories, and literature in their own languages.

If you are interested in taking a Wikidata course, visit Wiki Education’s course offerings page to get started.


Explore the upcoming Wikidata Institute, Wikidata Salon, and other opportunities to engage with Wikidata at learn.wikiedu.org.

Trouble with some wikis

Wednesday, 10 July 2024 15:26 UTC

Jul 10, 15:26 UTC
Resolved - This incident has been resolved.

Jul 10, 15:18 UTC
Monitoring - A fix has been implemented and we are monitoring the results.

Jul 10, 15:05 UTC
Investigating - We are aware of issues with accessing some wikis, and we are investigating.


A statement from Wikimedia Australia


Wikimedia Australia (WMAU) and the WMAU Board would like to acknowledge and give thanks to the Movement Charter Drafting Committee (MCDC) for their hard work over many years to produce the current Movement Charter and the Supplementary Documents. WMAU strongly supports the need for a Movement Charter as a Movement Strategy priority and appreciates the huge contribution the MCDC have made towards achieving this.

WMAU strongly endorses the aims of the Movement Strategy to increase diversity and equity in representation and inclusive decision-making across the global Wikimedia community. Current centralisation of power in the Wikimedia Foundation and the 12 WMF Board of Trustees Members is not representative or equitable, and is no longer appropriate for a global public interest platform.

Despite the significant time and effort already invested in the Charter process, the WMAU Board does not believe this is reason enough to ratify the proposed model as is. Although the Movement Charter is moving in the right direction, the WMAU Board is concerned that the model as proposed leaves open too much potential for unintended consequences.

WMAU’s chief concerns are that the proposed model:

  • is complex and bureaucratic
  • does not provide appropriate mechanisms for review, evaluation and iteration
  • does not provide adequate mechanisms for oversight and ensuring transparency and accountability
  • does not make it clear how diversity, inclusion and representation will be achieved
  • does not adequately communicate the separation of responsibilities between the Global Council, Global Council Board, the Wikimedia Foundation and the WMF Board of Trustees, resulting in a lack of clarity in relation to the operation of the Global Council and the Global Council Board.

As a result, the WMAU Board feels it cannot vote yes in good conscience. It is for these reasons that the WMAU Committee has opted to abstain by casting a blank vote.

We did not come to this decision lightly. We discussed the proposed model at length within the Board and with our Chapter membership at a public meeting. In reaching this decision, the WMAU Board wants to make it clear that we and the Chapter remain committed to supporting and promoting diversity, inclusion and representation in the Wikimedia community, and we support ongoing moves towards more equitable and inclusive decision-making with respect to all Wikimedia Movement Organisations. We support a renewed effort to improve the current Charter. To that end, we recommend the MCDC consider separating ratification of the Principles and the parts of the Charter outlining the roles of existing Movement Bodies from the far more ambitious proposal to set up a Global Council and Global Council Board. The WMAU Board endorses the Charter Principles and Values and welcomes the clarity the Charter provides on the roles of various Movement Bodies. Our concerns relate to the need for more consideration of the constitution, representation, resourcing, voting, transparency, accountability and amendment processes of the Global Council and the Global Council Board.

In particular, we are extremely concerned that the model is deliberately difficult to amend, with unclear review or evaluation processes. This is a major issue given the complexity of the structure that is being proposed. As others have noted, this directly contradicts the Recommendation of the Movement Strategy #10 Evaluate, iterate, and adapt. We would like to see a model that is more adaptable and open to oversight, evaluation and review to reduce the risks associated with introducing a complex and bureaucratic new layer of governance such as the Global Council.

Beyond the proposed Charter itself, the WMAU Board wishes to flag concerns with the ratification process as well. Legitimate questions can be raised as to the role of the WMF Board of Trustees in the ratification process. Specifically, we note that the voting arrangement effectively gives the WMF Board of Trustees a veto over the passage of the Charter. Regardless of how that is wielded, it undermines the legitimacy of the spirit of community based decision-making the Charter seeks to enact.

We are also concerned that the Board Liaisons’ Reflections published on Friday 21 June 2024 had a negative impact on the Charter ratification process. Whether intended or not, the Board Liaisons unduly influenced community discussion of the Charter (and likely how votes were cast) by publicly stating their recommendation that the WMF Board of Trustees not ratify the Charter, because the release of that recommendation could reasonably be read as an announcement of how the WMF Board of Trustees intended to vote (whether their vote followed the recommendation or not does not matter). This action was counter to the MCDC’s request that the WMF Board of Trustees’ vote not be shared until after the vote of individuals and affiliates had concluded, to avoid influencing the voting. Unfortunately, the release of the Board Liaisons’ recommendation has been widely construed as a deliberate attempt to influence the vote. Whether or not that was the intention, it has been both the effect and the perception.

WMAU looks forward to working together with the different stakeholders on next steps in the ongoing journey towards better governance and decision-making for the global Wikimedia community.  

Wikimedia Australia Board

Documenting manhole covers in Spain

Tuesday, 9 July 2024 05:13 UTC

Fremantle

· Wikimedia · photography ·

A fascinating journey: 10 years of manhole cover photography from our community, 8 July 2024 by Sara Santamaria:

Documenting a manhole cover has become an essential part of the community’s trips and outings. Over the years, some members have developed an affinity for certain covers that they consider particularly representative. Mentxu Ramilo, for example, found a 1925 manhole cover in Vitoria-Gasteiz that she found fascinating. “I let myself be infected by the Wikimedian spirit and passions, and by everything that forms part of the graphic heritage and deserves to be documented,” explains Mentxu.

I think we of WikiClubWest are going to have to up our game of cataloguing all the street things! :-)

Tech News issue #28, 2024 (July 8, 2024)

Monday, 8 July 2024 00:00 UTC
2024, week 28 (Monday 08 July 2024)

Tech News: 2024-28

weeklyOSM 728

Sunday, 7 July 2024 10:03 UTC

27/06/2024-03/07/2024

lead picture

SotM France 2024 – Lyon [1] | © OSM-France

Mapping

  • Marco Antonio mapped El Cardón National Park in Bolivia using official boundary data from PROMETA, an environmental conservation organization of Tarija, Bolivia.
  • Roxystar is currently mapping street lamps in Munich, complete with additional details such as the lamp’s height, to simulate the light coverage by using OSMStreetLight.
  • rtnf on Mastodon emphasised the importance of mapping building entrances to help people avoid getting lost, citing personal experience of having to circle a building to find the entrance. znrgl pointed out in the conversation that it is easy to record entrances with Every Door at any time while travelling.
  • DENelson83 has completed a project to manually map all the forested areas on Vancouver Island from aerial imagery, improving the detail and accuracy of the island’s forested regions on OpenStreetMap.
  • Comments were requested on the following:
    • The proposal to deprecate crossing=zebra in favour of crossing:markings.
    • The proposal to introduce the volunteers: prefix for locations/features that have need of volunteers, including whether new volunteers are accepted, urgency of need, signup information, and benefits for volunteers.

Mapping campaigns

  • The Open Mapping Hub – Asia Pacific from HOT celebrated the winners of the Climate Change Challenge, recognising the efforts to generate valuable data through OpenStreetMap in 14 Asia Pacific countries. Special thanks were given to Open Mapping Gurus from Nigeria, Peru, and Niger, and the winning teams will soon receive their prizes. Countries mapped include Indonesia, India, the Philippines, Nepal, and more.
  • Pavy_555 visited JNTU Hyderabad, to promote smart mobile mapping using the Every Door app, emphasising community engagement and the importance of updating OpenStreetMap data with local amenities and micro-mapping efforts.
  • IVIDES.org is promoting a campaign for the collaborative mapping of the Brazilian coastal and marine zones. The project uses OpenStreetMap and will be carried out to evaluate aspects related to the sustainability of this strategic region. Registration is open for participation in the pilot mapping, and the research coordinator presents the initiative in her diary.

Community

  • The OpenStreetMap community is invited to participate in WikiCon 2024, taking place from 4 to 6 October in Wiesbaden, Germany. Volunteers are needed to staff the OSM booth and promote the project to a wider audience. Travel and accommodation costs can be covered by FOSSGIS e.V. for participants from outside the Wiesbaden or Rhein-Main area. If you are interested, you can note this directly on the wiki page.

Events

  • [1] Bristow presents a photo retrospective of the 10th SotM France conference, held in Lyon from 28 to 30 June 2024. Attendance records were broken, with over 300 people taking part. Recordings of the presentations will soon be available online on PeerTube.
  • The deadline for early bird pricing for the 2024 State of the Map, running from 6 to 8 September, has been extended until 31 July.
  • The FOSS4G Perth 2024 conference, scheduled for 23 October in conjunction with the ISPRS TC IV Mid-Term Symposium, has opened its Call for Presentations, inviting the open geospatial community to share insights on tools such as QGIS, PostGIS, and OpenStreetMap.
  • The State of the Map 2024 programme offers a diverse range of sessions, workshops, and lectures. The event will take place from 6 to 8 September in Nairobi, Kenya, covering topics such as sustainable transport, local mapping initiatives, integration into academic curricula, and innovative data collection methods.

Education

  • OpenStreetMap contributor Denis_Helfer is organising an introduction to OSM on 15 July in Strasbourg, France. This will likely be followed by a series of workshops in autumn.

Maps

  • JveuxDuSoleil is a web application that simulates urban shadows to help users find sunny terraces in cities such as Paris, Marseille, and Nantes. Users can zoom in on the map to see where terraces will be sunny at certain times. However, the project currently has functionality problems, as building models and their shadows are no longer generated due to maintenance issues.

OSM in action

  • The ‘Los Pueblos más Bonitos de España’ website offers a guide to the most beautiful villages in Spain, with resources such as an OpenStreetMap-based village map application for geolocalised travel and a guidebook for sale to help organise trips to these charming places.
  • The GLOBE programme’s data visualisation tool allows users to explore environmental data collected around the world, filtering by protocol, date range, and geographical location, with options to download and analyse specific datasets for educational and scientific purposes.
  • The Toll/ST Ceritapeta tool allows users to visualise and measure driving distances from various toll gates and train stations in Jakarta, Indonesia, on an OpenStreetMap background. The tool is used to aid decision-making when choosing a residential complex in the suburbs of the Jakarta Metropolitan Area, as driving distances to the nearest transport infrastructure serve as a good indicator of connectivity.
  • The Naturkalender ZAMG map allows users to explore various natural observations, such as plant and animal phenology data. It provides detailed visualisations of seasonal changes and species distribution, supporting citizen science and ecological research.
  • The Mosquito Alert map displays real-time reports of mosquito sightings and breeding sites submitted by users on an OSM background, contributing to public health research and control efforts. The interactive map allows users to explore mosquito data geographically, helping to track the spread and presence of different mosquito species.
  • Norbert Tretkowski navigated around Norway using Organic Maps on a Google Pixel 3, detailing the app’s performance and challenges with features such as tunnel navigation, estimated arrival times, and ferry integration.
  • velowire.com displays the routes of the most important cycle races on OpenStreetMap maps and offers them for download.
  • NNG and Dacia have partnered to offer Dacia drivers OSM based navigation maps, providing a community-driven, frequently updated, and feature-rich map solution to enhance the driving experience.

Open Data

  • The Heidelberg Institute for Geoinformation Technology (HeiGIT) has made OSM land use data available on HeiData, providing TIFF tiles for EU countries and the UK. This data is derived from Sentinel-2 imagery and OpenStreetMap, which is classified into categories such as agricultural areas and urban regions using a deep learning model. The datasets can be used by urban planners, environmental researchers, and others for various applications.

Software

  • Badge(r)s is a location-based GPS game where players collect virtual items, quadrants, and regions, acting as both creators and collectors. Badges, the primary virtual items, appear on the map at specific coordinates or in players’ collections.
  • The June 2024 MapLibre newsletter announced two minor releases of MapLibre GL JS, progress on a Vulkan backend for MapLibre Native, and the release of Martin Tile Server v0.14. It welcomed new sponsors and highlighted upcoming events, including FOSS4G EU and State of the Map Europe.
  • Amanda details improvements and ongoing issues with WaterwayMap.org, including a new flow direction grouping feature, bugs in river bifurcation calculations, and gaps caused by geojson-to-vector tile conversion, and invites feedback and discussion from the community.

Programming

  • emersonveenstra introduced the ‘Rapid Power User Extension’, a new Chrome/Firefox extension that integrates with OpenStreetMap to redirect edit buttons to Rapid and add Strava heatmap support as overlay imagery. The extension is in early development, and users are encouraged to report issues and suggestions on GitHub.
  • Mark Stosberg explored the optimisation of Minneapolis’ low-stress bicycle network connectivity using spatial analysis for generating isochrones to measure bicycle travel distances within the network. He described his process using QGIS, JOSM, and Valhalla to create a customised routing network and generate multiple isochrones. The aim is to prioritise segments for improvement based on their impact on overall connectivity.
  • The new osmapiR package is now published on CRAN, the official repository for R packages. After almost one year of development and polishing, the package implements all API calls and includes complete documentation with examples for all functions. With this publication, alongside the existing packages osmdata (implementing Overpass calls) and osmextract (for working with .pbf files), R is now a first-class language for working with OpenStreetMap.

Did you know …

  • … the map 1NITE TENT, where private individuals offer overnight accommodation with a tent on their property? This is particularly useful in countries where wild camping is prohibited.
  • … about the different tools to convert opening hours into OSM syntax, display them, and fix any errors?

Other “geo” things

  • Robin Wilson has created a demo app for searching an aerial image using text queries like “tennis courts” or “swimming pool”. Under the hood, it extracts embedding vectors from the SkyCLIP AI model for small chips of the image and compares them using vector similarity metrics.
  • Cameroon and Nigeria have agreed to resolve their long-standing border dispute through joint on-the-ground verification and demarcation, with the aim of completing the process by the end of 2025 without recourse to the courts. The agreement, facilitated by the Cameroon-Nigeria Mixed Commission, focuses on areas such as Rumsiki, Tourou, and Koche, and addresses the challenges posed by Boko Haram terrorism in the region.
  • tlohde discussed the concept and application of average colors in digital maps, highlighting how averaging colors can simplify images while maintaining their recognisable features.
  • Grant Slater shared that he has updated ZA-aerial with the latest 25 cm resolution aerial photos covering all of South Africa, provided by South African National Geo-spatial Information (NGI). The full announcement can be found on the OSGeo Africa mailing list.
  • The initial release of the Panoramax Android app, announced at the State of the Map France 2024, offers an alpha/beta version available for download as an APK, and will be published on the Play Store and F-Droid. The app allows users to contribute geolocated photos to the Panoramax database, a free alternative to Google Street View for OpenStreetMap.

Upcoming Events

Where | What | When
Tartu linn | FOSS4G Europe 2024 | 2024-06-30 – 2024-07-07
中正區 | OpenStreetMap x Wikidata Taipei #66 | 2024-07-08
Lyon | Pique-nique OpenStreetMap | 2024-07-09
München | Münchner OSM-Treffen | 2024-07-09
San Jose | South Bay Map Night | 2024-07-10
Salt Lake City | OSM Utah Monthly Map Night | 2024-07-11
Bochum | Bochumer OSM Treffen | 2024-07-10
Lorain County | OpenStreetMap Midwest Meetup | 2024-07-11
Amsterdam | Maptime Amsterdam: Summertime Meetup | 2024-07-11
Berlin | DRK Online Road Mapathon | 2024-07-11
Wildau | 193. Berlin-Brandenburg OpenStreetMap Stammtisch | 2024-07-11
Zürich | 165. OSM-Stammtisch Zürich | 2024-07-11
Portsmouth | Introduction to OpenStreetMap at Port City Makerspace | 2024-07-13 – 2024-07-14
København | OSMmapperCPH | 2024-07-14
Strasbourg | découverte d’OpenStreetMap | 2024-07-15
Richmond | MapRVA – Bike Lane Surveying & Mapping Meetup | 2024-07-16
England | OSM UK Online Chat | 2024-07-15
online | Missing Maps London: (Online) Mid-Month Mapathon | 2024-07-16
Bonn | 177. OSM-Stammtisch Bonn | 2024-07-16
Hannover | OSM-Stammtisch Hannover | 2024-07-17
Łódź | State of the Map Europe 2024 | 2024-07-18 – 2024-07-21
Zürich | Missing Maps Zürich Mapathon | 2024-07-18
online | OSMF Engineering Working Group meeting | 2024-07-19
Cocody | OSM Africa July Mapathon – Map Ivory Coast | 2024-07-20
Stadtgebiet Bremen | Bremer Mappertreffen | 2024-07-22

Note:
If you would like to see your event here, please add it to the OSM calendar. Only events entered there will appear in weeklyOSM.

This weeklyOSM was produced by Raquel Dezidério Souto, SeverinGeo, Strubbl, barefootstache, derFred, euroPathfinder, mcliquid, muramototomoya, rtnf.
We welcome link suggestions for the next issue via this form and look forward to your contributions.

Is Wikibase Right for Your Project?

Saturday, 6 July 2024 00:00 UTC

Wikibase, the powerful open-source software behind Wikidata, offers robust features for structured data management. But is it the right choice for your project? Let's explore when Wikibase shines and when you might want to consider alternatives.

We also have a comparison between Wikibase and Semantic MediaWiki.

When to Consider Alternatives to Wikibase

1. Access Restriction Requirements

Wikibase is designed for either fully open or fully closed data environments. While editing restrictions can be implemented, viewing permissions are all-or-nothing: users who can access some data can access all data. Wikibase is thus less suitable for projects requiring granular access controls or a mix of open and restricted data within the same instance.

2. Real-time Data Processing

Wikibase is not suitable for real-time data processing or high-frequency updates. Stream processing systems or time-series databases are more appropriate for such cases. Wikibase's update speed limit is about 30 edits per second, depending on the underlying system resources.
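As a rough illustration of staying under that ceiling, here is a minimal client-side pacing sketch in Python. The make_edit() helper is hypothetical and stands in for whatever call your import script makes against the Wikibase editing API (for example, action=wbeditentity); authentication and error handling are omitted.

    import time

    EDITS_PER_SECOND = 10  # stay well below the ~30 edits/second ceiling

    def make_edit(item_id, data):
        # Hypothetical placeholder: a real script would call the Wikibase
        # editing API here (for example, action=wbeditentity).
        print(f"editing {item_id} with {len(data)} statements")

    def bulk_edit(edits):
        min_interval = 1.0 / EDITS_PER_SECOND
        for item_id, data in edits:
            started = time.monotonic()
            make_edit(item_id, data)
            # Sleep off the rest of the interval so the average rate stays bounded.
            elapsed = time.monotonic() - started
            if elapsed < min_interval:
                time.sleep(min_interval - elapsed)

    bulk_edit([("Q1", []), ("Q2", [])])  # placeholder item IDs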

When a Wikibase Item is edited, the entire previous version is kept, so making many edits to large Items ends up using a lot of storage space.

Related: Fast Bulk Import Into Wikibase

3. Domain-Specific UIs

If your project requires forms or user interfaces with special restrictions or complex business logic for editing or viewing data, something other than Wikibase's standard interface may be required. However, custom development can address many of these needs.

At Professional Wiki, we've developed extensions like Wikibase Export for domain-specific data export and Automated Values for encoding business rules. If you wish to use Wikibase and need such customizations, check out our Wikibase software development services.

You might also wish to consider Semantic MediaWiki, a MediaWiki extension somewhat similar to Wikibase, which supports data entry via domain-specific forms and offers more UI customization options. You can also check out our Wikibase vs. Semantic MediaWiki comparison.

4. Limited System Resources

Wikibase requires a relatively powerful server to run efficiently, especially for larger datasets or high-traffic scenarios. It's unsuitable for environments with minimal computing resources, such as serving data from a Raspberry Pi.

At Professional Wiki, we offer Wikibase hosting services that ensure optimal performance and reliability.

When Wikibase Excels

1. Collaborative Knowledge Creation

Because Wikibase is a layer on top of MediaWiki, the software developed for Wikipedia, it is fantastic for collaborative knowledge curation. Let your team(s) build and maintain your knowledge base together, or even open up your wiki to public editing. Wikibase comes with change logs, anti-vandalism tools, approval flows, and the ability to roll back changes.

2. Flexible Data Modeling

Create and evolve your own data model with Wikibase. Because Wikibase is built on top of a graph database, you avoid the artificial restriction of database tables. Define your properties and choose which ones you use on each item. Describe special cases, or do rapid prototyping without forcing your future self to live with a sub-optimal schema.

3. Interconnected Knowledge Representation

Wikibase excels at representing interconnected data. Its linked data model allows the creation of rich information networks with meaningfully connected entities. This structure enables intuitive navigation through complex datasets and supports powerful querying capabilities. By using external identifiers, you can connect your data to other data sets, such as Wikidata. Such connections enable federated queries that combine information from your and other Wikibases.
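As an illustration, a federated SPARQL query can join your own items with Wikidata in a single request. The Python sketch below is only a sketch: the query endpoint URL and the property P2 (assumed to hold a Wikidata QID as an external identifier) are placeholders for your own Wikibase, and it assumes your query service allows federation with the Wikidata endpoint; the SERVICE clause is the standard SPARQL federation mechanism.

    import requests

    # Placeholder endpoint; substitute your own Wikibase's SPARQL query service.
    SPARQL_ENDPOINT = "https://example-wikibase.org/query/sparql"

    QUERY = """
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    PREFIX wdt: <https://example-wikibase.org/prop/direct/>
    SELECT ?item ?qid ?wikidataLabel WHERE {
      ?item wdt:P2 ?qid .    # P2: assumed external-ID property holding a Wikidata QID
      BIND(IRI(CONCAT("http://www.wikidata.org/entity/", ?qid)) AS ?wd)
      SERVICE <https://query.wikidata.org/sparql> {
        ?wd rdfs:label ?wikidataLabel .
        FILTER(LANG(?wikidataLabel) = "en")
      }
    }
    LIMIT 10
    """

    response = requests.get(
        SPARQL_ENDPOINT,
        params={"query": QUERY},
        headers={"Accept": "application/sparql-results+json"},
    )
    response.raise_for_status()
    for row in response.json()["results"]["bindings"]:
        print(row["item"]["value"], row["wikidataLabel"]["value"])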

4. Multilingual and International Projects

With built-in support for multiple languages, Wikibase is ideal for international projects. It allows for seamless content management in various languages, including right-to-left scripts. Labels, descriptions, and aliases can be added in multiple languages for each entity, facilitating global collaboration and access.

5. Qualified and Referenced Data

Wikibase supports the addition of qualifiers and references to statements, providing context and provenance for each piece of information. This feature enhances data reliability and allows for a nuanced representation of complex or time-dependent facts, which is crucial for scientific, historical, or evolving datasets.

6. Version Control for Your Data

Every change in Wikibase is tracked and reversible. The system maintains a complete history of edits, allowing users to review past versions, compare changes, and revert to previous states if needed. This robust version control ensures data integrity and supports accountability in collaborative environments.

Common Wikibase Use Cases

Wikibase's versatility makes it an ideal solution for various knowledge management needs. Here are some of the most common and impactful use cases we've seen among our clients:

Organizational Knowledge Management

Businesses increasingly turn to Wikibase to create flexible internal knowledge bases that can describe complex attributes and relationships. These knowledge bases can serve as a single source of truth for the entire organization and support analytics via complex queries against the knowledge graph.

Open Data Initiatives

Organizations leveraging Wikibase for open data initiatives benefit from its powerful combination of structured data management and accessibility. Government agencies, research institutions, and forward-thinking companies use Wikibase to create comprehensive data portals that foster transparency and innovation. A key advantage is Wikibase's adherence to open standards: through its Web API and SPARQL endpoint, data is easily retrievable in formats like JSON, RDF, and CSV, enabling seamless integration into various projects and applications.
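For instance, fetching a single entity as JSON takes one call to the standard wbgetentities module of the action API. In this Python sketch, the instance URL and the entity ID Q1 are placeholders for your own Wikibase:

    import requests

    # Placeholder URL; point this at your own Wikibase instance.
    API_URL = "https://example-wikibase.org/w/api.php"

    response = requests.get(
        API_URL,
        params={
            "action": "wbgetentities",
            "ids": "Q1",        # placeholder entity ID
            "format": "json",
            "languages": "en",
        },
    )
    response.raise_for_status()
    entity = response.json()["entities"]["Q1"]
    print(entity["labels"]["en"]["value"])
    print(sorted(entity["claims"].keys()))  # property IDs used in statements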

Wikibase's structured data model facilitates complex queries, allowing users to uncover insights that would stay hidden in traditional databases. For instance, a city government could use Wikibase to publish urban planning data, enabling citizens to create custom visualizations of zoning changes or track infrastructure projects. Researchers might combine this with other sources to analyze urban development trends, while businesses could integrate it into location-based services.

Cultural Heritage Cataloging

GLAM institutions (Galleries, Libraries, Archives, and Museums) and historical research projects are harnessing Wikibase to revolutionize how cultural heritage is cataloged, linked, and explored. This versatile platform enables these organizations to create rich, interconnected knowledge bases that serve internal management needs and public engagement goals.

Libraries and archives use Wikibase to manage bibliographic records and metadata for diverse media, from ancient manuscripts to digital publications. For instance, a national library consortium might employ Wikibase to create a unified catalog that links books to authors and subjects to historical events, geographical locations, and related archival materials. This approach enhances resource discovery and facilitates advanced research by revealing hidden connections within collections.

Museums and galleries leverage Wikibase to catalog and manage their collections, including artworks, artifacts, and exhibits. A museum network could use the platform to build a comprehensive digital inventory that links objects across institutions, connecting them to their historical context, artistic movements, and conservation records. This linking streamlines curation processes and enables compelling narratives for public exhibitions and educational programs.

In historical research, Wikibase excels at managing and linking complex data. Projects focused on genealogy or local history can create vast information networks, connecting historical figures to events, places, and primary source documents. For example, a city archive might use Wikibase to organize and link historical photographs, census records, and maps, allowing researchers to trace the evolution of neighborhoods over time or track family histories over generations.

Research Data Management

Universities and research institutions harness Wikibase to create integrated research ecosystems. For example, a university might use Wikibase to build a repository that stores research outputs and maps the relationships between publications, datasets, researchers, and funding sources. This interconnected approach facilitates interdisciplinary collaboration, helps demonstrate research impact, and supports compliance with data management requirements from funding bodies.

Conclusion

Wikibase is a powerful solution for organizations dealing with complex, interconnected data requiring a flexible and robust management system. Its strengths in adaptable data modeling, support for qualified and referenced information, and comprehensive version control make it well-suited for cultural heritage projects, research data management, and open data initiatives.

However, Wikibase isn't the right fit for every project. Organizations needing real-time data processing, highly specific user interfaces, or granular access controls may need to look elsewhere or consider custom development on top of Wikibase.

If you're considering Wikibase for your project or looking to optimize your existing Wikibase implementation, Professional Wiki offers comprehensive Wikibase services to support your needs. Our team of Wikibase experts can guide you through the decision-making process, assist with importing data into Wikibase, host your Wikibase, and even develop new Wikibase features.

WikiCon Australia 2024

Thursday, 4 July 2024 12:00 UTC
Submissions are open for WikiCon.

WikiCon 2024 will be held on Saturday 23rd of November 2024 in Adelaide, South Australia.

Submissions are now invited for WikiCon Australia 2024. We encourage submissions from anyone interested in Wikipedia and its sister projects, with special consideration given to the work of Wikimedians in Australia, South East Asia, and the Pacific regions. The closing date for submissions is 31 July 2024.

Further information about the submission process and travel scholarships is available on Meta-Wiki.

It is also anticipated that a number of pre-conference activities will be available for those arriving in Adelaide on Friday, November 22nd. Please register your interest for activities and catering on Humanitix.

Date: Saturday 23rd November

Venue: Ibis Adelaide

Contacts: contact@wikimedia.org.au

Conference Website: Visit the Conference webpage on Meta-Wiki

Summary: this article shares the experience and lessons learned from migrating the Wikimedia Toolforge platform away from Kubernetes PodSecurityPolicy and onto Kyverno.

Christian David, CC BY-SA 4.0, via Wikimedia Commons

Wikimedia Toolforge is a Platform-as-a-Service, built with Kubernetes, and maintained by the Wikimedia Cloud Services team (WMCS). It is completely free and open, and we welcome anyone to use it to build and host tools (bots, webservices, scheduled jobs, etc) in support of Wikimedia projects. 

We provide a set of platform-specific services, command line interfaces, and shortcuts to help with tasks like setting up webservices and jobs, building container images, or using databases. Using these interfaces makes the underlying Kubernetes system pretty much invisible to users. We also allow direct access to the Kubernetes API, and some advanced users do interact with it directly.

Each account has a Kubernetes namespace where it can freely deploy its workloads. We have a number of controls in place to ensure performance, stability, and fairness of the system, including quotas, RBAC permissions, and, until recently, PodSecurityPolicies (PSP). At the time of this writing, we had around 3,500 Toolforge tool accounts in the system.

We adopted PSP early, in 2019, as a way to make sure Pods had the correct runtime configuration. We needed Pods to stay within the safe boundaries of a set of pre-defined parameters. Back when we adopted PSP there was already the option to use third-party agents, like OpenPolicyAgent Gatekeeper, but we decided not to invest in them and went with a native, built-in mechanism instead.

In 2021 it was announced that the PSP mechanism would be deprecated and removed in Kubernetes 1.25. Even though we had been warned years in advance, we did not prioritize the migration away from PSP until we were on Kubernetes 1.24 and blocked, unable to upgrade further without taking action.

The WMCS team explored different alternatives for this migration, but eventually we decided to go with Kyverno as a replacement for PSP. And with that decision began the journey described in this blog post.

First, we needed to refactor the source code of one of the key components of our Toolforge Kubernetes setup: maintain-kubeusers. This custom piece of software, which we built in-house, contains the logic to fetch accounts from LDAP and do the necessary instrumentation on Kubernetes to accommodate each one: create a namespace, RBAC, a quota, a kubeconfig file, etc. With the refactor, we introduced a proper reconciliation loop, so that the software has a notion of what needs to be done for each account, what is missing, what to delete, what to upgrade, and so on. This allows us to easily deploy new resources for each account, or iterate on their definitions.

The initial version of the refactor had a number of problems, though. For one, the new version of maintain-kubeusers was doing more filesystem interaction than the previous version, resulting in a slow reconciliation loop over all the accounts. We use NFS as the underlying storage system for Toolforge, and it can be very slow, for reasons beyond the scope of this blog post. This was corrected in the days following the initial refactor rollout. A side note on an implementation detail: we store a ConfigMap in each account namespace with the state of each resource. Storing more state in this ConfigMap was our solution for avoiding additional NFS latency.
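To make the reconciliation idea concrete, here is a heavily simplified sketch of such a loop using the official Kubernetes Python client. The account list, the resources handled, and the state ConfigMap name are illustrative placeholders; this is not the actual maintain-kubeusers code.

    from kubernetes import client, config
    from kubernetes.client.rest import ApiException

    STATE_CONFIGMAP = "maintain-kubeusers-state"  # illustrative name

    def desired_accounts():
        # Placeholder for the LDAP lookup of Toolforge tool accounts.
        return ["tool-example-a", "tool-example-b"]

    def ensure_namespace(core, name):
        try:
            core.read_namespace(name)
        except ApiException as e:
            if e.status != 404:
                raise
            core.create_namespace(
                client.V1Namespace(metadata=client.V1ObjectMeta(name=name))
            )

    def ensure_state_configmap(core, namespace):
        # Per-namespace record of which resources exist and in which version,
        # so later passes can skip slow filesystem (NFS) checks.
        body = client.V1ConfigMap(
            metadata=client.V1ObjectMeta(name=STATE_CONFIGMAP),
            data={"resources": "namespace,rbac,quota,kubeconfig"},
        )
        try:
            core.read_namespaced_config_map(STATE_CONFIGMAP, namespace)
        except ApiException as e:
            if e.status != 404:
                raise
            core.create_namespaced_config_map(namespace, body)

    def reconcile(core):
        for account in desired_accounts():
            ensure_namespace(core, account)
            ensure_state_configmap(core, account)
            # ... ensure RBAC, quota, kubeconfig, policies, and so on.

    if __name__ == "__main__":
        config.load_kube_config()  # use load_incluster_config() inside the cluster
        reconcile(client.CoreV1Api())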

I initially estimated this refactor would take me a week to complete, but unfortunately it took around three weeks instead. Prior to the refactor, updating the definition of a resource required several manual steps and cleanups. The process is now automated, more robust, performant, efficient, and clean. So in my opinion it was worth it, even if it took more time than expected.

Then, we worked on the Kyverno policies themselves. Because we had a very particular PSP setup, we tried to replicate its semantics on a 1:1 basis as much as possible in order to ease the transition. This involved things like transparent mutation of Pod resources followed by validation. Additionally, we had a different PSP definition for each account, so we decided to create a different namespaced Kyverno policy resource for each account namespace — and remember, we had 3.5k accounts.

We created a Kyverno policy template that we would then render and inject for each account.
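As a minimal sketch of that render-and-inject step, the snippet below fills a namespaced Kyverno Policy template with Python's string.Template and creates it through the Kubernetes CustomObjects API. The policy body (requiring runAsNonRoot) is only an illustrative stand-in for the real Toolforge policies, and validationFailureAction starts in Audit mode to match the staged rollout described below.

    from string import Template

    import yaml
    from kubernetes import client, config

    # Illustrative stand-in for the real Toolforge policy template.
    POLICY_TEMPLATE = Template("""
    apiVersion: kyverno.io/v1
    kind: Policy
    metadata:
      name: toolforge-pod-policy
      namespace: $namespace
    spec:
      validationFailureAction: Audit   # flipped to Enforce in a later stage
      rules:
        - name: require-non-root
          match:
            any:
              - resources:
                  kinds: [Pod]
          validate:
            message: "Containers must not run as root."
            pattern:
              spec:
                securityContext:
                  runAsNonRoot: true
    """)

    def inject_policy(api, namespace):
        body = yaml.safe_load(POLICY_TEMPLATE.substitute(namespace=namespace))
        api.create_namespaced_custom_object(
            group="kyverno.io",
            version="v1",
            namespace=namespace,
            plural="policies",
            body=body,
        )

    if __name__ == "__main__":
        config.load_kube_config()
        inject_policy(client.CustomObjectsApi(), "tool-example-a")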

For developing and testing all this, both maintain-kubeusers and the Kyverno bits, we had a project called lima-kilo, a local Kubernetes setup replicating production Toolforge. This was used by each engineer on their laptop as a common development environment.

We had planned the migration from PSP to Kyverno policies in stages, like this:

  1. update our internal template generators to make Pod security settings explicit
  2. introduce Kyverno policies in Audit mode
  3. see how the cluster would behave with them, and if we had any offending resources reported by the new policies, and correct them
  4. modify Kyverno policies and set them in Enforce mode
  5. drop PSP

In stage 1, we updated things like the toolforge-jobs-framework and tools-webservice.

In stage 2, when we deployed the 3.5k Kyverno policy resources, our production cluster died almost immediately. Surprise. All the monitoring went red, the Kubernetes apiserver became unresponsive, and we were unable to perform any administrative actions in the Kubernetes control plane, or even on the underlying virtual machines. All Toolforge users were impacted. This was a full-scale outage that required the energy of the whole WMCS team to recover from. We temporarily disabled Kyverno until we could learn what had occurred.

This incident happened despite our having tested beforehand in lima-kilo and in another pre-production cluster we had, called Toolsbeta. But we had not tested with that many policy resources. Clearly, this was something scale-related. After the incident, I created 3.5k Kyverno policy resources in lima-kilo, and indeed I was able to reproduce the outage. We took a number of measures, corrected a few errors in our infrastructure, reached out to the Kyverno upstream developers for advice, and in the end did the following to adapt the setup to our needs:

  • corrected the external HAProxy health checks for the Kubernetes apiservers, from checking just for an open TCP port to actually checking the /healthz HTTP endpoint, which more accurately reflects the health of each apiserver.
  • created a more realistic development environment: in lima-kilo, we added a couple of helper scripts to create/delete 4,000 policy resources, each in a different namespace.
  • greatly over-provisioned memory in the Kubernetes control plane servers, that is, more memory in the base virtual machines hosting the control plane. Increasing the memory headroom of the apiserver prevents it from running out of memory and crashing the whole system. We went from 8 GB of RAM per virtual machine to 32 GB. In our cluster, a single apiserver pod could eat 7 GB of memory on a normal day, so 8 GB on the base virtual machine was clearly not enough. I also sent a patch proposal for the Kyverno upstream documentation, suggesting they clarify the additional memory pressure on the apiserver.
  • corrected resource requests and limits of Kyverno, to more accurately describe our actual usage.
  • increased the number of replicas of the Kyverno admission controller to 7, so admission requests could be handled by Kyverno in a more timely manner.

I have to admit that, out of concern over the performance of this architecture, I was briefly tempted to drop Kyverno, stop pursuing an external policy agent entirely, and write our own custom admission controller. However, after applying all the measures listed above, the system became very stable, so we decided to move forward. The second attempt at deploying it all went through just fine. No outage this time 🙂

When we were in stage 4 we detected another bug. We had been following the Kubernetes upstream documentation for setting securityContext to the right values. In particular, we were enforcing procMount to be set to the default value, which per the docs was ‘DefaultProcMount’. However, that string is the name of the internal variable in the source code, whereas the actual default value is the string ‘Default’. This caused pods to be rejected by Kyverno, exactly as the policy instructed, while we figured out the problem. We sent a patch upstream to fix this problem.
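A minimal illustration of the mismatch, with the securityContext fragment written as Python dicts; the field names come from the Pod spec, and only the procMount value is the point here:

    # What we initially enforced, copied from the name of the Go constant in the
    # Kubernetes source code; this pattern rejects perfectly normal pods.
    wrong_expectation = {"securityContext": {"procMount": "DefaultProcMount"}}

    # The actual default value of the procMount field in the Pod API.
    right_expectation = {"securityContext": {"procMount": "Default"}}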

We finally had everything in place, reached stage 5, and were able to disable PSP. We unloaded the PSP controller from the Kubernetes apiserver and deleted every individual PSP definition. Everything was very smooth in this last step of the migration.

This whole PSP project, including the maintain-kubeusers refactor, the outage, and all the different migration stages took roughly three months to complete.

For me, there are a number of valuable lessons to learn from this project. For one, scale is something to consider, and test, when evaluating a new architecture or software component. Not doing so can lead to service outages or unexpectedly poor performance. This is in the first chapter of the SRE handbook, but we got a reminder the hard way 🙂