Welcome to Wikidata, Queryzo!
Wikidata is a free knowledge base that you can edit! It can be read and edited by humans and machines alike, and you can go to any item page now and add to this ever-growing database!
Need some help getting started? Here are some pages you can familiarize yourself with:
If you have any questions, please ask me on my talk page. If you want to try out editing, you can use the sandbox. Once again, welcome, and I hope you quickly feel comfortable here and become an active editor for Wikidata.
Best regards! Taketa
Previous discussion was archived at User talk:Queryzo/Archive 1 on 2016-02-08.
Hello.
The Wikidata page of "Petr Janiš" (Q95392172) and the Wikidata page of "Petr Janiš" (Q95773157) are about the same Czech actor born in 1951, and need to be merged.
Yours sincerely, 31.200.16.100 11:11, 5 February 2024 (UTC)
Dear Queryzo,
I hope you are doing well,
I am Kholoud, a researcher at King's College London, and I work on a project as part of my PhD research, in which I have developed a personalised recommender system that suggests Wikidata items to editors based on their past edits. I am collaborating on this project with Elena Simperl and Miaojing Shi.
I am inviting you to a task-based study that will ask you to provide your judgments about the relevance of the items suggested by our system based on your previous edits.
Participation is completely voluntary, and your cooperation will enable us to evaluate the accuracy of the recommender system in suggesting relevant items to you. The results will be analysed in anonymised form and published at a research venue.
The study will start in late January 2022 or early February 2022, and it should take no more than 30 minutes.
If you agree to participate in this study, please either contact me at kholoud.alghamdi@kcl.ac.uk or use this form https://docs.google.com/forms/d/e/1FAIpQLSees9WzFXR0Vl3mHLkZCaByeFHRrBy51kBca53euq9nt3XWog/viewform?usp=sf_link
I will contact you with the link to start the study.
For more information about the study, please read this post: https://www.wikidata.org/wiki/User:Kholoudsaa
In case you have further questions or require more information, don't hesitate to contact me at the email address mentioned above.
Thank you for considering taking part in this research.
Regards
Dear Wikidata editors,
I hope you are doing well,
I am Kholoud, a researcher at King’s College London, and I work on a project as part of my PhD research that develops a personalized recommendation system to suggest Wikidata items for the editors based on their interests and preferences. I am collaborating on this project with Elena Simperl and Miaojing Shi.
I would love to talk with you to learn how you currently choose the items you work on in Wikidata and to understand the factors that might influence that decision. Your cooperation will give us valuable insights into building a recommender system that can help improve your editing experience.
Participation is completely voluntary. You have the option to withdraw at any time. Your data will be processed under the terms of UK data protection law (including the UK General Data Protection Regulation (UK GDPR) and the Data Protection Act 2018). The information and data that you provide will remain confidential; they will only be stored on the password-protected computers of the researchers. We will use the anonymized results to provide insight into how editors select items to edit, and we will publish the results of the study at a research venue. If you decide to take part, we will ask you to sign a consent form, and you will be given a copy of this consent form to keep.
If you’re interested in participating and have 15-20 minutes to chat (I promise to keep the time!), please either contact me at kholoudsaa@gmail.com or use this form https://docs.google.com/forms/d/e/1FAIpQLSdmmFHaiB20nK14wrQJgfrA18PtmdagyeRib3xGtvzkdn3Lgw/viewform?usp=sf_link with your choice of the times that work for you.
I’ll follow up with you to figure out the best way for us to connect.
Please contact me using the email mentioned above if you have any questions or require more information about this project.
Thank you for considering taking part in this research.
Regards
Kholoud
You started adding the episodes, but you stopped at episode 1711, with only 47 episodes missing. Why?
Dear Queryzo! Can you merge from Q10997761 to Q19571141? Thank you! --~~~~
Dear Queryzo! Can you merge from Q6069036 to Q31192949 and from Q6057437 to Q31192987? Thank you! --~~~~
Hello Queryzo, you merged Claudia Bach (b. 1965; Q95192072) with Claudia Bach (b. 1979; Q1097313). IMDb lists two more Claudia Bachs, so imho the entries should be kept separate and, where appropriate, marked with "said to be the same as" (P460).
Addendum: This has already been resolved. GND and VIAF mix up two people. GND 1093573813 was originally created for the 1979 Claudia Bach, so the merge can stay. For the 1965 Claudia Bach I have created a new item and am sending an error report to the GND editorial team.
Could you also contact the GND about this? There, the Filmportal link at the bottom right would then be wrong. Thanks!
Hello, I really don't see what references like these are supposed to achieve. If anything, a reference should point to where the evidence actually is. And even apart from that, it isn't necessary here, since there is already another valid reference. How many references do you think it takes to support a fact? If it were just this one item, I wouldn't say anything, but my watchlist is full of them.
They are about as useful as "imported from Wikimedia project" -> "German Wikipedia"; make of that what you will. In this case you can at least trace where a statement comes from, or where it is supported, especially when there are dozens of identifiers. Unfortunately, not all statements already have other references as in your example; checking that beforehand would have taken days. Incidentally, my watchlist was flooded by Edoderoobot, which followed up on hundreds of new items and added "film" as the Dutch description, not to mention Asturian labels for persons and thousands of Douban and TMDb IDs. I ask for your understanding for the inconvenience.
On references pointing to Wikimedia projects I absolutely agree with you. Whenever I can, I delete those again; that would also have been possible here. And this isn't about inconvenience for me; it really is about usefulness. Couldn't you at least restrict this to items that have no references, or only Wikimedia references?
As I said, unfortunately that would have been considerably more cumbersome, or (for me) technically not feasible. Hence, unfortunately, the blanket approach.
Hi! Thank you for correcting my edit. I added the ordinal because Q80087856 is stated as part (episode 10) of season 4 (Q80032847). Admittedly I may be out of my depth with my additions here, and I will certainly watch for other errors, but it seems this one ought to be fixed further?
Yes, I just fixed it: https://www.wikidata.org/w/index.php?title=Q80032847&diff=prev&oldid=1305163271
Oh, great! I intend to do some writeup of what/how I'm doing, and should probably add some checks before proceeding... Another similar error was spotted, which I commented on at User_talk:Kam_Solusar#Bidirectional_series_ordinals - if you're more experienced here than I am (I'm learning to explore WDQS), do you have any advice for how I could proceed?
By WDQS, do you mean Wikidata QuickStatements or the Wikidata Query Service? Usually you create lists of items with the Query Service, which are then updated with QuickStatements. In this case you can add columns for Property:P361, which was formerly used instead of Property:P4908. Those P361 values can be deleted after the P4908 statements have been added. But keep in mind that parts of double episodes have both P361 AND P4908. A simple way to do this is PetScan, which can also use the Query Service. For example, the following query lists episodes that still have both P361 and P4908 pointing to the same season with the same ordinal:
# Episodes (Q21191270) that carry both a season (P4908) statement and a
# part of (P361) statement pointing to the same season with the same
# series ordinal (P1545).
SELECT ?item ?itemLabel ?season ?seasonLabel ?number WHERE {
  ?item wdt:P31 wd:Q21191270 .
  # Skip instances of Q21664088, excluded per the double-episode caveat above.
  MINUS { ?item wdt:P31 wd:Q21664088 . }
  ?item p:P4908 ?season_statement .
  ?season_statement ps:P4908 ?season .
  ?season_statement pq:P1545 ?number .
  ?item p:P361 ?is_part_of_statement .
  ?is_part_of_statement ps:P361 ?season .
  ?is_part_of_statement pq:P1545 ?number .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en" }
} ORDER BY ?item
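For the QuickStatements step, a minimal sketch in the tab-separated V1 command format could add the P4908 statement with its series ordinal qualifier and then remove the old P361 statement. The item and season below are only the example from this thread (which may already be fixed), not a batch to run as-is:

Q80087856	P4908	Q80032847	P1545	"10"
-Q80087856	P361	Q80032847

The first line adds season (P4908) with series ordinal (P1545) "10"; the leading minus in the second removes the matching part of (P361) statement. As noted above, parts of double episodes should keep both properties, so any such batch should exclude them.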
Hello Vunj9rxopceoe4i1,
Really sorry for the inconvenience. This is a gentle note to request that you check your email. We sent you a message titled "The Community Insights survey is coming!". If you have questions, email surveys@wikimedia.org.
You can see my explanation here.