Wiki-MID: a very large Multi-domain Interests Dataset of Twitter users with mappings to Wikipedia

New! (May 2020)

Message-based interests for both Italian and English are now provided with timestamps:


Wiki-MID Dataset

Wiki-MID is a LOD-compliant multi-domain interest dataset for training and testing recommender systems. Our English dataset includes an average of 90 multi-domain preferences per user on music, books, movies, celebrities, sport, politics and much more, for about half a million Twitter users tracked over six months in 2017. Preferences are either extracted from messages of users who use Spotify, Goodreads and other similar content-sharing platforms, or induced from their "topical" friends, i.e., followees representing an interest rather than a social relation between peers. In addition, preferred items are matched with the Wikipedia articles describing them. This unique feature of our dataset provides a means to categorize preferred items by exploiting semantic resources linked to Wikipedia, such as the Wikipedia Category Graph, DBpedia, BabelNet and others.

Data model:

Figure 1: The data model adopted for the design of our resource.

Our resource is designed on top of the Semantically-Interlinked Online Communities (SIOC) core ontology, which favors the inclusion of data mined from social network communities into the Linked Open Data (LOD) cloud. As shown in Figure 1, we represent Twitter users as instances of the SIOC UserAccount class. Topical users and message-based user interests are then associated, via the Simple Knowledge Organization System (SKOS) predicate relatedMatch, with the corresponding Wikipedia page, as a result of our automated mapping methodology.
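For reference, the classes and predicates named above resolve to the standard namespace IRIs printed by this minimal sketch (the class name `WikiMidVocab` is ours, for illustration only):

```java
public class WikiMidVocab {
    // Standard namespace IRIs of the vocabularies used in the data model.
    static final String RDF  = "";
    static final String SIOC = "";
    static final String SKOS = "";

    public static void main(String[] args) {
        // Twitter accounts are typed as instances of sioc:UserAccount ...
        System.out.println(SIOC + "UserAccount");
        // ... and linked to Wikipedia pages via skos:relatedMatch.
        System.out.println(SKOS + "relatedMatch");
    }
}
```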


To better understand the released resource, we provide in this section an instance example of the adopted data model. Let "" be a generic Twitter user and "" be the Twitter account of one of their friends (e.g., "Katy Perry").
With the following set of triples, the first two declare the two Twitter accounts as instances of the SIOC UserAccount class, and the last one states that the first user follows the second:

<> <> <>  .
<> <> <>  .  
<> <> <>  .

With the following triple we state that "" is a related match to the Wikipedia page "Katy_Perry":

<> <> <> .

With the following triple we state that "" is interested in the movie "":

<> <> <> .

Finally, with the following triple we provide details about the extracted interest:

<> <> <> .
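Since the concrete IRIs in the example above are elided, the following stdlib-only sketch reproduces the shape of the instance in N-Triples syntax. The subject and object IRIs (the `alice` account, the `katyperry` account, the Wikipedia URL) are hypothetical placeholders, not the dataset's real identifiers; only the vocabulary IRIs come from the standard RDF, SIOC and SKOS namespaces.

```java
import java.util.List;

public class InstanceExample {
    // Standard vocabulary IRIs.
    static final String RDF_TYPE = "<>";
    static final String SIOC_UA  = "<>";
    static final String FOLLOWS  = "<>";
    static final String MATCH    = "<>";

    // Hypothetical subject/object IRIs, for illustration only.
    static final String USER   = "<>";
    static final String FRIEND = "<>";
    static final String WIKI   = "<>";

    /** Serializes one N-Triples statement. */
    static String triple(String s, String p, String o) {
        return s + " " + p + " " + o + " .";
    }

    public static void main(String[] args) {
        List<String> triples = List.of(
            triple(USER,   RDF_TYPE, SIOC_UA),  // both accounts are sioc:UserAccount instances
            triple(FRIEND, RDF_TYPE, SIOC_UA),
            triple(USER,   FOLLOWS,  FRIEND),   // the user follows the topical friend
            triple(FRIEND, MATCH,    WIKI)      // the friend maps to a Wikipedia page
        );
        triples.forEach(System.out::println);
    }
}
```

Note that sioc:follows is SIOC's standard property for the follower relation; the exact predicate used in the dataset is not visible in the elided triples above.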


If you use the Wiki-MID Dataset in your research, please cite this publication:

    Di Tommaso, Giorgia and Faralli, Stefano and Stilo, Giovanni and Velardi, Paola
    Wiki-MID: a very large Multi-domain Interests Dataset of Twitter users with mappings to Wikipedia
    In Proceedings of the 17th International Semantic Web Conference, ISWC 2018,
     2018, Monterey (California)

       Paper        BibTeX
The resources are licensed under:
      Creative Commons Attribution-NonCommercial-ShareAlike 4.0 License.


How to use it with Jena:

package it.uniroma1.wikimid;


import org.apache.jena.query.*;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.shared.PrefixMapping;

public class App {

    static final String datasetPath = "in/Wiki-MID_IT.nt";

    public static void main(String[] args) throws IOException {
        Model model = ModelFactory.createDefaultModel();
        model.setNsPrefixes(PrefixMapping.Standard);
        // Load the dataset (N-Triples serialization) into the in-memory model., "N-TRIPLES");
        String queryString = "SELECT ?x WHERE { ?x  <> <> }";
        Query qry = QueryFactory.create(queryString);
        try (QueryExecution qe = QueryExecutionFactory.create(qry, model)) {
            ResultSet rs = qe.execSelect();
            while (rs.hasNext()) {
                QuerySolution sol = rs.nextSolution();
                System.out.println(sol);
            }
        }
    }
}
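The English dataset is large, so materializing the whole file in an in-memory Jena model can be expensive. As a lighter-weight alternative, here is a JDK-only sketch (the class name and path are ours, assuming one triple per line in the N-Triples file) that streams the file and counts lines mentioning a given predicate IRI:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.stream.Stream;

public class PredicateCount {
    /**
     * Counts N-Triples lines that mention the given IRI.
     * Approximate: this is a plain substring match, so an IRI occurring
     * as subject or object would also be counted.
     */
    static long countLines(String path, String iri) throws IOException {
        try (Stream<String> lines = Files.lines(Paths.get(path))) {
            return lines.filter(l -> l.contains("<" + iri + ">")).count();
        }
    }

    public static void main(String[] args) throws IOException {
        long n = countLines("in/Wiki-MID_IT.nt",
        System.out.println("skos:relatedMatch triples: " + n);
    }
}
```

For a streaming parse with full RDF term handling, Jena's RDFDataMgr with a StreamRDF sink is the more robust option.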

Software Repository:

We provide a code repository to share part of the pipeline components used for the construction of the Wiki-MID resource: