5 Analysing engagement with local news outlets on social media
In this chapter, I use my Academic Twitter API access to extract and analyse engagement on Twitter between local news outlets and their communities.
```r
# set_bearer() # then set this up as an environment variable and restart R, so reload packages
# get_bearer()
```
5.1 Data Collection
5.1.1 Twitter handles
Through Media.info I was able to retrieve very few handles, so I resorted to extracting Twitter handles from the websites available in my directory. Not all URLs were still active, but luckily some of them redirected to their new sites. One example is the Evening Express, which is now under the Press and Journal in Aberdeen.
From 1059 web URLs, I managed to extract 450 handles.
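The scraping step itself is not shown in this notebook. Below is a minimal sketch of one way to do it with rvest; the helper `extract_handle`, its filtering rules, and the `urls` vector are illustrative assumptions, not the exact script I ran.

```r
library(rvest)
library(stringr)
library(purrr)

# Illustrative helper: fetch a homepage and pull the first Twitter
# handle linked anywhere on the page. read_html() follows redirects,
# which is how moved sites still resolve. Dead urls return NA.
extract_handle <- function(url) {
  hrefs <- tryCatch(
    read_html(url) |> html_elements("a") |> html_attr("href"),
    error = function(e) character(0)
  )
  h <- str_match(hrefs, "twitter\\.com/(?:#!/)?@?([A-Za-z0-9_]{1,15})")[, 2]
  h <- h[!is.na(h) & !tolower(h) %in% c("intent", "share", "search", "hashtag", "home")]
  if (length(h) == 0) NA_character_ else h[1]
}

# handles <- map_chr(urls, extract_handle)  # urls: the 1059 directory urls
```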
5.1.2 News outlets id and profile information
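No chunk survives for this step; a minimal sketch with academictwitteR, assuming `handles` is the character vector of the 450 handles from 5.1.1, would look like this. Both calls require a valid bearer token.

```r
library(academictwitteR)

# get_user_id() resolves handles to user ids (NA when an account no
# longer exists); get_user_profile() then pulls the profile objects
# (name, description, location, ...).
ids <- get_user_id(handles, bearer_token = get_bearer())
profiles <- get_user_profile(na.omit(ids), bearer_token = get_bearer())
# saveRDS(profiles, "profiles.RDS")
```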
5.1.3 Tweets
I got all 2020, 2021, and 2022 tweets for 450 of my news outlets.
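The collection chunk is not shown either; a minimal sketch, assuming the same `handles` vector and looping one account at a time in the style of the other loops in this chapter:

```r
library(academictwitteR)
library(dplyr)

# data_path writes the raw pages to disk, and resume_collection()
# can pick a run back up if it is interrupted.
tweets <- data.frame()
for (h in handles) {
  iteration_h <- get_all_tweets(
    users        = h,
    start_tweets = "2020-01-01T00:00:00Z",
    end_tweets   = "2023-01-01T00:00:00Z",
    bearer_token = get_bearer(),
    data_path    = "tweets/",
    n            = Inf
  )
  tweets <- bind_rows(tweets, iteration_h)
}
# saveRDS(tweets, "tweets.RDS")
```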
5.1.4 Twitter profile followers
This took forever but now it’s done.
```r
# ids <- drop_na(ids)
# id <- na.omit(ids$id)
# followers <- data.frame()
# for (i in id) {
#   iteration_i <- get_user_followers(
#     x = i,
#     bearer_token = get_bearer())
#   followers <- bind_rows(followers, iteration_i)
# }
# saveRDS(followers, "followers.RDS")

# currently gathered this data (out of 441 handles):
# - 1:65 done
# - 100:120 done on Julien's PC
# - 121:126 done in background job from script "followers_second_script.R"
# - 400:end Pablo is collecting
# left to do:
# - 66:99
# - 127:210
# - 211:300
# - 301:400
```
5.1.5 Tweet retweeters
I have paused this.
```r
# tweets_retweets_likes <- tweets %>%
#   select(author_id, conversation_id, text, public_metrics) %>%
#   filter(!str_detect(text, "RT @"))
#
# tweets_id <- tweets_retweets_likes$conversation_id
#
# # the function hydrate_tweets() is not so useful; the one below is better,
# # as it includes the geo location of retweeters (started running at 13:48, 8/2)
# tweets_retweets <- data.frame()
# for (id in tweets_id) {
#   iteration_x <- get_retweeted_by(
#     id,
#     bearer_token = get_bearer(),
#     data_path = "retweets/",
#     verbose = TRUE
#   )
#   tweets_retweets <- bind_rows(tweets_retweets, iteration_x)
# }
#
# saveRDS(tweets_retweets, "tweets_retweets.RDS")
```
5.1.6 Tweet likers
I have paused this.
```r
# tweets_liking_users <- data.frame()
# for (id in tweets_id) {
#   iteration_x <- get_liking_users(
#     id,
#     bearer_token = get_bearer(),
#     verbose = TRUE
#   )
#   tweets_liking_users <- bind_rows(tweets_liking_users, iteration_x)
# }
```
5.2 Analysis
```r
# # dates
# tweets_2021 %>%
#   select(created_at) %>%
#   mutate(date = as.Date(created_at)) %>%
#   ggplot(aes(x = date)) + # ok so I found out my tweets from 2021 actually went all the way to 2023
#   geom_histogram(stat = "count")
```
5.2.1 Outlet location vs followers location
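Nothing is implemented here yet. One possible starting point, sketched under loud assumptions: an `outlets` table mapping `outlet_id` to the outlet's `town`, and a `followers` table with one row per outlet-follower pair carrying the follower's free-text `location`; every one of those names is hypothetical.

```r
library(dplyr)
library(stringr)

# Hypothetical sketch: flag a follower as "local" when the outlet's
# town appears in their free-text profile location, then compute the
# share of local followers per outlet.
local_share <- followers %>%
  left_join(select(outlets, outlet_id, town), by = "outlet_id") %>%
  mutate(
    is_local = str_detect(
      str_to_lower(coalesce(location, "")),
      fixed(str_to_lower(town))
    )
  ) %>%
  group_by(outlet_id) %>%
  summarise(share_local_followers = mean(is_local, na.rm = TRUE))
```

Free-text profile locations are noisy, so exact town matching will undercount; a gazetteer or geocoding step would be the obvious refinement.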
5.2.2 Outlet location vs engagers location
5.2.3 Post location vs engagers location