
Commit c58d769: update readme
1 parent f79686c

1 file changed: README.md (13 additions, 10 deletions)
@@ -13,8 +13,6 @@
| Code | [GitHub](https://github.com/ScholCommLab/fhe-plos)|
| Data | [Dataverse](https://dataverse.harvard.edu/privateurl.xhtml?token=58246dfc-bdf8-454d-8edc-60d5918dedfc) |

-This repository is part of a broader investigation of the hidden engagement on Facebook. More information about the project can be found [here](https://github.com/ScholCommLab/facebook-hidden-engagement).
-
---

This repository contains all figures and tables present in the manuscript for "How much research shared on Facebook is hidden from public view?". Output files can be found in:
@@ -28,9 +26,11 @@ Furthermore, all the input data and code required to reproduce results are provi
- `prepare_data.py` - data preprocessing
- `analysis.py` - data analysis and outputs

-## Inital Data Collection
+This article is part of a broader investigation of the hidden engagement on Facebook. More information about the project can be found [here](https://github.com/ScholCommLab/facebook-hidden-engagement).
+
+## Initial Data Collection

-The data used in this paper was collected using our own methods. The data collection method is described in [Enkhbayar and Alperin (2018)](https://arxiv.org/abs/1809.01194). Code & instructions can be found [here](https://github.com/ScholCommLab/fhe-plos).
+The data used in this paper was collected using our own methods. The data collection method is described in [Enkhbayar and Alperin (2018)](https://arxiv.org/abs/1809.01194). Code & instructions can be found [here](https://github.com/ScholCommLab/fhe-plos).

## Reproduce results


@@ -40,24 +40,27 @@ Packages specified in `requirements.txt` can be installed via

```pip install -r requirements.txt```

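Optionally, the requirements can be installed inside a virtual environment to keep them isolated from the system Python; a minimal sketch using the standard `venv` module (not part of the original instructions):

```
# create and activate a virtual environment, then install the pinned requirements
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt
```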

-1. Clone this repository and cd into it
+1. Clone this repository and cd into the scripts folder

```
git clone git@github.com:ScholCommLab/fhe-plos-paper.git
-cd fhe-plos-paper
+cd fhe-plos-paper/scripts
```

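The clone step uses the SSH remote; if no SSH key is configured for GitHub, cloning over HTTPS works as well:

```
# HTTPS alternative to the SSH clone above
git clone https://github.com/ScholCommLab/fhe-plos-paper.git
cd fhe-plos-paper/scripts
```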
2. Download data from Dataverse.

All the data is hosted on dataverse: [Dataverse repository](https://dataverse.harvard.edu/privateurl.xhtml?token=58246dfc-bdf8-454d-8edc-60d5918dedfc)

-Using the helper script provided, you can download all files into the respective locations.
+Using the helper script provided, you can download all files into the respective locations. Make the script executable and ensure that you have `wget` installed.

-```download_data.sh```
+```
+chmod +x download_data.sh
+./download_data.sh
+```

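If `wget` is not already available, it can typically be installed through the system package manager, for example:

```
# Debian/Ubuntu
sudo apt-get install wget

# macOS (Homebrew)
brew install wget
```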
3. Preprocess data

-Run the preprocessing script to apply transformations on the input dataset.
+Run the preprocessing script to apply transformations on the input dataset. This step creates the file `data/articles.csv`.

```python process_data.py```

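To verify that the preprocessing step produced its output, the generated file can be inspected directly (the path `data/articles.csv` is taken from the step above and may be relative to the repository root rather than the `scripts` folder):

```
# confirm the preprocessed file exists and peek at its first lines
ls -lh data/articles.csv
head -n 3 data/articles.csv
```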
@@ -67,4 +70,4 @@ Packages specified in `requirements.txt` can be installed via

```python analysis.py```

-Optionally, you can also open the analysis notebook with Jupyter to explore the dataset.
+Optionally, you can also open the notebook `analysis.ipynb` with Jupyter to explore the dataset and results.
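Assuming Jupyter is available in the environment (it may not be included in `requirements.txt`), the notebook can be opened with:

```
# open the analysis notebook in the browser
jupyter notebook analysis.ipynb
```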
