
Cook: For Artificial Intelligence to Be Truly Smart, It Must Respect Human Values



In recent years, internet giants such as Facebook and Google have repeatedly been caught up in controversy over the misuse of user data and breaches of privacy. On October 24, Apple CEO Tim Cook delivered a speech on privacy and data security in Brussels, where the EU is headquartered. While voicing support for the EU's privacy protections, he also stressed that Apple will continue its long-standing commitment to respecting user privacy.

Looking back at the first half of this year, Facebook was hit by scandal several times. In February, investigators looking into Russian interference in the US presidential election charged 13 Russian nationals with using Facebook, Twitter, and Instagram to meddle in the election, paying to run ads smearing Hillary Clinton in order to influence voters' decisions. In March, The New York Times reported that a researcher had obtained Facebook users' personal information and sold it to Cambridge Analytica, a consulting firm hired by the Trump campaign and the Republican Party, which then used the data to profile and target voters.

事實(shí)上,早在2016年大選時(shí),就有人指控Facebook被用于傳播不實(shí)信息,釋放政治誘餌。而《紐約客》的一篇文章則指出,F(xiàn)acebook早在2015年12月就已經(jīng)發(fā)現(xiàn)了這個(gè)問(wèn)題,但卻一直保持沉默,只是在媒體曝光之后才站出來(lái)承認(rèn)了這件事。

The string of scandals sent Facebook's share price tumbling, at one point wiping roughly a hundred billion dollars off its market value. At the same time, governments have been stepping up regulation of, and penalties for, these internet giants.

To protect data privacy, the EU's General Data Protection Regulation (GDPR) took effect on May 25 this year. Billed as the strictest privacy law ever, it allows violators to be fined up to 10 million euros or 2% of the previous financial year's global turnover for lesser infringements, and up to 20 million euros or 4% for the most serious ones, in each case whichever figure is higher.
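For concreteness, the "whichever is higher" wording means the ceiling is simply the larger of a fixed amount and a share of turnover. The short Python sketch below (hypothetical turnover figures, illustration only, not legal advice) shows how the two GDPR fine tiers cap out:

```python
# Illustrative only: GDPR (Article 83) caps fines at the greater of a fixed
# amount and a share of global annual turnover. Inputs below are hypothetical.

def gdpr_fine_cap(global_turnover_eur: float, serious: bool = True) -> float:
    """Maximum possible fine in euros for one infringement.

    serious=True  -> upper tier: up to EUR 20 million or 4% of turnover
    serious=False -> lower tier: up to EUR 10 million or 2% of turnover
    In both tiers, the higher of the two figures applies.
    """
    fixed, pct = (20_000_000, 0.04) if serious else (10_000_000, 0.02)
    return max(fixed, pct * global_turnover_eur)

print(gdpr_fine_cap(50_000_000_000))       # 2,000,000,000.0 -- 4% of turnover dominates
print(gdpr_fine_cap(100_000_000, False))   # 10,000,000 -- the fixed cap is the larger figure
```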

Cook was in Europe at the invitation of an international conference on data privacy. In his remarks, he warmly praised the EU's new rules and expressed Apple's determination to protect users' data privacy.

The following are excerpts from Cook's speech:

Around the world, from Copenhagen to Chennai to Cupertino, new technologies are driving breakthroughs in humanity's greatest common projects: from preventing and fighting disease, to curbing the effects of climate change, to ensuring every person has access to information and economic opportunity.

At the same time, we see vividly, and painfully, how technology can harm rather than help. Platforms and algorithms that promised to improve our lives can actually magnify our worst human tendencies. Rogue actors and even governments have taken advantage of user trust to deepen divisions, incite violence, and even undermine our shared sense of what is true and what is false.

This crisis is real. It is not imagined, or exaggerated, or "crazy." And those of us who believe in technology's potential for good must not shrink from this moment. Now, more than ever, as leaders of governments, as decision-makers in business, and as citizens, we must ask ourselves a fundamental question: What kind of world do we want to live in?

I'm here today because we hope to work with you as partners in answering this question. At Apple, we are optimistic about technology's awesome potential for good. But we know that it won't happen on its own. Every day, we work to infuse the devices we make with the humanity that makes us. As I've said before, "Technology is capable of doing great things. But it doesn't want to do great things. It doesn't want anything. That part takes all of us."

That's why I believe that our missions are so closely aligned. As Giovanni puts it, "We must act to ensure that technology is designed and developed to serve humankind, and not the other way around." We at Apple believe that privacy is a fundamental human right. But we also recognize that not everyone sees things as we do. In a way, the desire to put profits over privacy is nothing new.

As far back as 1890, future Supreme Court Justice Louis Brandeis published an article in the Harvard Law Review, making the case for a "Right to Privacy" in the United States. He warned: "Gossip is no longer the resource of the idle and of the vicious, but has become a trade." Today that trade has exploded into a data industrial complex. Our own information, from the everyday to the deeply personal, is being weaponized against us with military efficiency.

Every day, billions of dollars change hands, and countless decisions are made, on the basis of our likes and dislikes, our friends and families, our relationships and conversations, our wishes and fears, our hopes and dreams. These scraps of data, each one harmless enough on its own, are carefully assembled, synthesized, traded, and sold.

Taken to its extreme, this process creates an enduring digital profile and lets companies know you better than you may know yourself. Your profile is then run through algorithms that can serve up increasingly extreme content, pounding our harmless preferences into hardened convictions. If green is your favorite color, you may find yourself reading a lot of articles, or watching a lot of videos, about the insidious threat from people who like orange. In the news, almost every day, we bear witness to the harmful, even deadly, effects of these narrowed world views.

We shouldn't sugarcoat the consequences. This is surveillance. And these stockpiles of personal data serve only to enrich the companies that collect them.

This should make us very uncomfortable. It should unsettle us. And it illustrates the importance of our shared work and the challenges still ahead of us.

Fortunately, this year, you've shown the world that good policy and political will can come together to protect the rights of everyone. We should celebrate the transformative work of the European institutions tasked with the successful implementation of the GDPR. We also celebrate the new steps taken, not only here in Europe, but around the world. In Singapore, Japan, Brazil, New Zealand, and many more nations, regulators are asking tough questions and crafting effective reforms. It is time for the rest of the world, including my home country, to follow your lead.

We at Apple are in full support of a comprehensive federal privacy law in the United States. There, and everywhere, it should be rooted in four essential rights: First, the right to have personal data minimized. Companies should challenge themselves to de-identify customer data, or not to collect it in the first place. Second, the right to knowledge. Users should always know what data is being collected and what it is being collected for. This is the only way to empower users to decide what collection is legitimate and what isn't. Anything less is a sham. Third, the right to access. Companies should recognize that data belongs to users, and we should all make it easy for users to get a copy of, correct, and delete their personal data. And fourth, the right to security. Security is foundational to trust and all other privacy rights.
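Read as requirements, the four rights Cook lists map fairly directly onto operations a data-holding service would have to support. The sketch below is a hypothetical Python model with invented names, not any real Apple or GDPR API, intended only to make the mapping concrete:

```python
# Hypothetical sketch of the four rights as service operations.
# All names are invented for illustration.
import hashlib
from dataclasses import dataclass, field

@dataclass
class UserDataService:
    records: dict = field(default_factory=dict)  # pseudonymous_id -> personal data

    # 1. Data minimization: de-identify before storing, or collect nothing at all.
    @staticmethod
    def pseudonymize(user_id: str) -> str:
        return hashlib.sha256(user_id.encode()).hexdigest()[:16]

    # 2. Right to knowledge: state up front what is collected and why.
    collection_notice = {"fields": ["country", "app_version"],
                         "purpose": "crash diagnostics"}

    # 3. Right to access: users can copy, correct, and delete their data.
    def export(self, pid: str) -> dict:
        return dict(self.records.get(pid, {}))

    def correct(self, pid: str, key: str, value) -> None:
        self.records.setdefault(pid, {})[key] = value

    def delete(self, pid: str) -> None:
        self.records.pop(pid, None)

    # 4. Right to security: in a real system the stored data would be
    #    encrypted at rest and in transit; omitted in this sketch.
```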

Now, there are those who would prefer I hadn't said all of that. Some oppose any form of privacy legislation. Others will endorse reform in public, and then resist and undermine it behind closed doors. They may say to you, "Our companies will never achieve technology's true potential if they are constrained with privacy regulation." But this notion isn't just wrong, it is destructive.

Technology's potential is, and always must be, rooted in the faith people have in it, in the optimism and creativity that it stirs in the hearts of individuals, in its promise and capacity to make the world a better place. It's time to face facts. We will never achieve technology's true potential without the full faith and confidence of the people who use it.

At Apple, respect for privacy, and a healthy suspicion of authority, have always been in our bloodstream. Our first computers were built by misfits, tinkerers, and rebels, not in a laboratory or a board room, but in a suburban garage. We introduced the Macintosh with a famous TV ad channeling George Orwell's 1984, a warning of what can happen when technology becomes a tool of power and loses touch with humanity.

And way back in 2010, Steve Jobs said in no uncertain terms: "Privacy means people know what they're signing up for, in plain language, and repeatedly."

It's worth remembering the foresight and courage it took to make that statement. When we designed this device, we knew it could put more personal data in your pocket than most of us keep in our homes. And there was enormous pressure on Steve and Apple to bend our values and to freely share this information. But we refused to compromise. In fact, we've only deepened our commitment in the decade since.

From hardware breakthroughs that encrypt fingerprints and faces securely, and only, on your device, to simple and powerful notifications that make clear to every user precisely what they're sharing and when they are sharing it.

We aren't absolutists, and we don't claim to have all the answers. Instead, we always try to return to that simple question: What kind of world do we want to live in?

At every stage of the creative process, then and now, we engage in an open, honest, and robust ethical debate about the products we make and the impact they will have. That's just a part of our culture. We don't do it because we have to, we do it because we ought to. The values behind our products are as important to us as any feature.

We understand that the dangers are real, from cyber-criminals to rogue nation states. We're not willing to leave our users to fend for themselves. And, we've shown, we'll defend those principles when challenged. Those values, that commitment to thoughtful debate and transparency, are only going to get more important. As progress speeds up, these things should continue to ground us and connect us, first and foremost, to the people we serve.

Artificial Intelligence is one area I think a lot about. Clearly, it's on the minds of many of my peers as well. At its core, this technology promises to learn from people individually to benefit us all. Yet advancing AI by collecting huge personal profiles is laziness, not efficiency. For Artificial Intelligence to be truly smart, it must respect human values, including privacy. If we get this wrong, the dangers are profound. We can achieve both great Artificial Intelligence and great privacy standards. It's not only a possibility, it is a responsibility. In the pursuit of artificial intelligence, we should not sacrifice the humanity, creativity, and ingenuity that define our human intelligence. And at Apple, we never will.
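One family of techniques often cited in support of this claim is local differential privacy, which Apple has publicly described using for some of its analytics. The sketch below is a minimal, simplified randomized-response example in Python, not Apple's implementation: each device reports only a noisy value, yet the population-level statistic can still be recovered.

```python
# Minimal randomized-response sketch (a classic local differential privacy
# mechanism). Simplified illustration only -- not Apple's implementation.
import random

P_TRUTH = 0.75  # probability a device reports its true value

def noisy_report(true_value: bool) -> bool:
    """Report the truth with probability P_TRUTH, otherwise a fair coin flip.
    The raw value never has to leave the device."""
    return true_value if random.random() < P_TRUTH else random.random() < 0.5

def estimate_rate(reports: list) -> float:
    """Invert the known noise to estimate how common the attribute really is.
    observed = P_TRUTH * true_rate + (1 - P_TRUTH) * 0.5"""
    observed = sum(reports) / len(reports)
    return (observed - (1 - P_TRUTH) * 0.5) / P_TRUTH

# Simulate 100,000 devices where 30% of users have some sensitive attribute.
reports = [noisy_report(random.random() < 0.30) for _ in range(100_000)]
print(round(estimate_rate(reports), 3))  # close to 0.30, learned from noisy reports only
```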

In the mid-19th century, the great American writer Henry David Thoreau found himself so fed up with the pace and change of industrial society that he moved to a cabin in the woods by Walden Pond. Call it the first digital cleanse. Yet even there, where he hoped to find a bit of peace, he could hear a distant clatter and whistle of a steam engine passing by. "We do not ride on the railroad," he said. "It rides upon us." Those of us who are fortunate enough to work in technology have an enormous responsibility. It is not to please every grumpy Thoreau out there. That's an unreasonable standard, and we'll never meet it.

We are responsible, however, for recognizing that the devices we make and the platforms we build have real, lasting, even permanent effects on the individuals and communities who use them. We must never stop asking ourselves: What kind of world do we want to live in?

The answer to that question must not be an afterthought, it should be our primary concern. We at Apple can, and do, provide the very best to our users while treating their most personal data like the precious cargo that it is. And if we can do it, then everyone can do it.

Fortunately, we have your example before us. Thank you for your work, for your commitment to the possibility of human-centered technology, and for your firm belief that our best days are still ahead of us.

Thank you very much.

The author is an intern at Caijing.

Editor: 未麗燕 | Source: Caijing Magazine