Business Standard

What's wrong with letting tech run our schools

Tech companies tend to offer free services in return for access to data, a deal that raises serious privacy concerns

Bloomberg 


Tech moguls are conducting an enormous experiment on the nation’s children, and we should not be so trusting that they’ll get it right. Google has taken a big role in public education, offering low-cost laptops and free apps. Mark Zuckerberg of Facebook is investing heavily in educational technology, largely through his philanthropic initiative, and Netflix head Reed Hastings has been tinkering with expensive algorithmic ed-tech tools.

Encouraging as all this may be, the technologists might be getting ahead of themselves, both politically and ethically. There’s also not much evidence that what they’re doing works. Like it or not, education is political. People on opposite sides of the spectrum read very different science books and can’t seem to agree on fundamental principles. It stands to reason that what we choose to teach our children will vary depending on our beliefs. That’s to acknowledge, not defend, anti-scientific curricula.

Zuckerberg and Bill Gates learned this the hard way last year, when authorities ordered the closure of 60 schools they had backed — part of a network providing highly scripted, low-cost education in Africa — amid allegations that the schools had been “teaching pornography” and “conveying the gospel of homosexuality” in sex-ed classes. Let’s face it: something similar could easily happen here if tech initiatives expand beyond the apolitical math subjects on which they have so far focused.

Beyond that, there are legitimate reasons to be worried about letting tech companies wield so much influence in classrooms. They tend to offer “free services” in return for access to data, a deal that raises some serious privacy concerns — particularly if you consider that it can involve tracking kids’ every click, keystroke and backspace from kindergarten on.

My oldest son is doing extremely well as a junior in school right now, but he was a late bloomer who didn’t learn to read until third grade. Should that be a part of his permanent record, data that future algorithms could potentially use to assess his suitability for credit or a job? Or what about a kid whose “persistence score” on dynamic, standardised tests waned in 10th grade? Should colleges have access to that information in making their admissions decisions?

These are not far-fetched scenarios. Consider the fate of nonprofit education venture InBloom, which sought to collect and integrate student records in a way that would allow lessons to be customised. The venture shut down a few years ago amid concerns about how sensitive information — including tags identifying students as “tardy” or “autistic” — would be protected from theft and shared with outside vendors.

Google and others are collecting similar data and using it internally to improve their software. Only after some prompting did Google agree to comply with the privacy law known as FERPA, which has been weakened over the years to facilitate third-party data sharing. It’s not clear how the data will ultimately be used.

