00:37.4
But as I understand it, it wasn't on a Meta platform, right?
00:42.0
But you're right that this kind of content
00:47.1
sometimes crosses platforms.
00:50.3
It isn't confined to just one platform, right?
00:54.2
So we see that a lot also.
00:55.7
For example, there will be a post
00:57.8
that appears on Twitter, or X.
01:01.1
And then it gets screenshotted and re-shared on other platforms,
01:05.2
like Instagram or Facebook.
01:06.8
So we see a lot of cross-platform activity.
01:09.6
Although this particular deepfake that spread on another platform,
01:15.7
I personally did not see it,
01:17.9
because on that platform it had already been removed.
01:20.8
And I am not sure whether it is spreading on Meta's platforms,
01:26.1
on Facebook or Instagram.
01:27.8
Okay, before we continue, a little public service for those who are still confused.
01:32.2
I'll talk about the stylebook.
01:33.4
The stylebook, Jen, bear with me for a few minutes.
01:38.8
Not even minutes, just seconds.
01:40.2
I keep getting asked,
01:41.0
Mr. Christian, why don't you address BBM as president?
01:47.1
Let's make it: beloved President Ferdinand Bongbong Marcos Jr.
01:49.9
Is that okay now?
01:51.8
I've been dealing with that question since the last administration.
01:54.4
There's something called a stylebook in journalism.
01:57.3
On first mention: President Bongbong Marcos Jr., or President Marcos Jr.
02:01.6
On subsequent mentions, Marcos or BBM is fine.
02:05.8
So it's not a sign of disrespect
02:07.5
when I simply say BBM.
02:09.5
Take Pope Francis, for example.
02:11.1
If you read stories on the wires, for instance,
02:16.4
Pope Francis can be just Francis.
02:19.6
That doesn't mean he's being disrespected.
02:21.2
Same with Trump, Obama, Biden, right? It's not disrespectful to use those terms.
02:27.1
I'm just explaining it once again, okay?
02:30.1
Back to regular programming, Jen.
02:32.1
Are you still there? Sorry about that.
02:33.1
On this podcast, we need to explain even very simple things like that.
02:39.1
So clearly, in cases like that, what is Facebook's mechanism?
02:44.1
Because I suppose this won't be the first and definitely won't be the last
02:50.1
when it comes to deepfake videos or audio that can be used to try to manipulate public sentiment.
02:56.1
What is Meta doing about that?
02:59.1
Yes. So, first of all, I would like to confirm what you said, Christian, that these are indeed being used in elections.
03:07.1
And we've seen that already. Deepfakes being used in elections.
03:11.1
I think it's also important to discuss what exactly is bad about deepfakes, right?
03:16.1
Because you could be sharing or posting a deepfake not because you have a sinister
03:24.1
or bad intention, but because you're condemning it,
03:29.1
or you're sharing it just to raise awareness that this kind of technology now exists. Right?
03:35.1
So I think it's important that we also consider the purpose behind sharing that post.
03:41.1
That's the harder part, right? How do you determine the real purpose behind sharing a post?
03:49.1
And we can talk about that, about disinformation, later on.
03:53.1
What are the platforms doing?
03:57.1
So the best practice now, which even policymakers in the EU and the US are pushing big tech companies to adopt, is to label.
04:07.1
Label the manipulated media.
04:11.1
Because as I said, there is nothing inherently bad about manipulated media.
04:19.1
There is nothing inherently bad about a photorealistic
04:22.1
image, or a realistic video or audio. Right?
04:28.1
It depends on what it depicts.
04:31.1
So it depends on the content of that video or audio.
04:34.1
But at the very least, it should be labeled "made with AI," marked as having been created using AI.
04:43.1
So removal is not necessarily the solution to manipulated media. It depends on the content of the media.
04:51.1
Here's an example, say.
04:52.1
Right, I understand that.
04:54.1
Because there also needs to be wide latitude when it comes to tolerating AI.
04:58.1
Because it might not be malicious; it could just be in fun. Right?
05:00.1
But take this case, say:
05:02.1
the voice of a head of state or head of government is faked.
05:05.1
And his face, say,
05:08.1
is made to look like the actual person in that image.
05:11.1
And then he says, "I want to wage war on China."
05:14.1
That's really frightening, right?
05:16.1
Because some lunatic or crazy person might believe it. Right?
05:19.1
In a case like that, what's the specific...
05:20.1
What's the specific policy when disinformation reaches that level?
05:24.1
So in that case, because it risks imminent harm to physical safety, or violence, or war,
05:34.1
it should be removed.
05:36.1
Because it now poses a specific harm.
05:38.1
We looked at the content of the manipulated media.
05:41.1
It could be interpreted as incitement to violence.
05:45.1
It will be removed for violating other content policies
05:50.1
of the big tech platforms.
05:53.1
Can Meta, Facebook for instance, do that motu proprio?
05:57.1
Or someone needs to file a complaint for that to be removed?
06:02.1
Meta can do that motu proprio.
06:06.1
Or any other platform, actually.
06:09.1
Because on YouTube, I checked and, as I said, it suddenly disappeared.
06:12.1
When I first watched it, it was still there.
06:15.1
Then the following day, after a day or so, it was gone.
06:19.1
So it can be done, for instance, on Meta as well.
06:22.1
So can you talk about the experience that you had?
06:26.1
You've been studying the impact of deepfakes and disinformation in the context of elections,
06:32.1
and how digital platforms are being used.
06:34.1
What are the scary lessons that we can draw from your observations?
06:40.1
You covered Myanmar, right?
06:42.1
Because we're having elections here in the Philippines in 2025.
06:46.1
What should we watch out for?
06:49.1
So what comes to mind related to elections is some news here.
06:56.1
Earlier this year, there were primaries,
06:59.1
because the US presidential election is also happening this year,
07:02.1
and one part of that process is the primary for nominating a presidential candidate.
07:10.1
And then in one state here, New Hampshire, there was an audio deepfake of the president, President Biden,
07:18.1
in which he was encouraging voters not to vote in the primary.
07:25.1
It's like disenfranchisement would happen.
07:28.1
Yes. So there would be voter suppression.
07:30.1
So that's one real threat.
07:32.1
So even here in the US, that's happening.
07:35.1
And in fact, legislators and policymakers are looking into the issue.
07:39.1
Because think about it, Christian: if you support the president,
07:46.1
and he said not to vote, and you believed him and didn't vote,
07:50.1
then the other political party would win.
07:54.1
So really, the threat to democratic processes posed by deepfakes is real.
08:01.1
True. And you know what? That one is still relatively benign: just don't vote.
08:06.1
Basically, as you said, voter suppression or disenfranchisement.
08:10.1
But what if he had said, "I'm withdrawing from..."
08:14.1
I don't know, any candidate.
08:16.1
For instance, a candidate for next year saying, "I'm withdrawing my candidacy."
08:20.1
That's the thing, right? One of the issues with the disinformation happening here is that,
08:26.1
once it's out, it's very difficult to go after it.
08:30.1
Right? The spread is just too fast.
08:32.1
It has a velocity of its own.
08:34.1
So in terms of policy, coming from Meta, how do you try to bridge that gap?
08:39.1
Or narrow that gap?
08:41.1
So, again, we have to look at the context.
08:44.1
There's a distinction there, based on what the video or audio contains.
08:49.1
Because manipulated media in itself is not harmful.
08:54.1
It depends on what's in it, right?
08:56.1
So there's a distinction.
08:58.1
If its content, as we have discussed, pressures people not to vote,
09:06.1
or, say, it claims there's no...
09:09.1
This isn't the voting...
09:11.1
Actually, Christian, it can be as simple as,
09:13.1
the polling place is closed, or the place where you're supposed to vote is closed,
09:20.1
so just come back tomorrow, when in fact today is the last day.
09:23.1
It can be as simple as that.
09:27.1
So something like that, a threat to democratic processes,
09:31.1
would violate most of the big tech platforms' other content policies on voter...
09:38.1
what they call voter or census interference.
09:41.1
So at Meta, that's the policy:
09:43.1
voter or census interference.
09:45.1
How does that work? Does Meta have a proactive approach to that, with people dedicated to it?
09:52.1
They take elections very seriously.
09:54.1
So they have...
09:59.1
There are many factors in determining the right response,
10:05.1
the right set of responses, for a particular election.
10:08.1
But usually, they set up
10:10.1
what they call an election operations center,
10:13.1
so they can respond to threats and challenges concerning elections in real time.
10:19.1
So there are resources allocated for that.
10:22.1
If you liked this video,
10:24.1
please like and share so that more people can see and watch it.
10:28.1
If you want to keep up with the news, follow me as well on my social media accounts shown on screen.
10:34.1
You can also send Super Thanks, Super Likes, Super Chats, and Super Stickers on YouTube,
10:39.1
as well as Facebook Stars.
10:41.1
You know, this will go a long way toward helping us make more videos
10:45.1
that help spread accurate information to every Filipino.
10:50.1
Every like, subscribe, follow, and share supports truly independent journalism in this country.
10:56.1
Thank you very, very much.