• 2017-01-09 13:37:44> @U2PFHNN3C: <@U2PFHNN3C> has joined the channel
  • 2017-01-09 13:37:45> @U2PFHNN3C: <@U2PFHNN3C> set the channel purpose: IA related questions
  • 2017-01-09 13:37:45> @U0B47KC3S: <@U0B47KC3S> has joined the channel
  • 2017-01-09 13:37:45> @U37GZRZU6: <@U37GZRZU6> has joined the channel
  • 2017-01-09 13:37:45> @U1PKXQVDW: <@U1PKXQVDW> has joined the channel
  • 2017-01-09 13:38:17> @U2PFHNN3C: Really interesting paper on #denoising https://arxiv.org/abs/1701.01698
  • 2017-01-09 15:12:00> @U0B47KC3S: hi there, are you all real here ? (reactions: @U37GZRZU6)
  • 2017-01-09 16:35:28> @U37GZRZU6: <@U0B47KC3S> is it a philosophical question ? :grinning:
  • 2017-01-09 17:51:08> @U0B47KC3S: yes otherwise can you prove that through concrete physics :wink:
  • 2017-01-09 18:16:12> @U2PFHNN3C: from within the answer must come, young padawan
  • 2017-01-09 18:16:41> @U2PFHNN3C: if I say Yes, would it encode any information whatsoever?
  • 2017-01-09 18:17:34> @U37GZRZU6: <@U0B47KC3S> <@U2PFHNN3C> are you both taking drugs? can I do drugs with you? it seems fun :smile:
  • 2017-01-09 18:19:23> @U0B47KC3S: <@U37GZRZU6> the same as him
  • 2017-01-09 18:19:24> @U0B47KC3S: https://www.youtube.com/watch?v=JulmkVVfyDA (reactions: @U37GZRZU6)
  • 2017-01-09 18:20:35> @U37GZRZU6: hahaha love it so much !!! perfect :slightly_smiling_face:
  • 2017-01-09 18:21:07> @U37GZRZU6: it will be my new philosophy of living... "the yes needs the no to win"
  • 2017-01-09 21:35:44> @U37GZRZU6: <@U0B47KC3S> I would answer your metaphysico-philosophical question with another question: is our experience of reality even real? :smiley: https://www.ted.com/talks/donald_hoffman_do_we_see_reality_as_it_is
  • 2017-01-12 00:18:33> @U3PLYAJPJ: <@U3PLYAJPJ> has joined the channel
  • 2017-01-16 22:35:52> @U0AAL4W13: <@U0AAL4W13> has joined the channel
  • 2017-01-17 15:45:11> @U3T7KBEMV: <@U3T7KBEMV> has joined the channel
  • 2017-01-19 20:17:51> @U3TUWV3SQ: <@U3TUWV3SQ> has joined the channel
  • 2017-01-24 10:15:50> @U0AAL4W13: Seems fantastic :) http://www.forbes.com/forbes/welcome/?toURL=http://www.forbes.com/sites/bernardmarr/2017/01/20/first-fda-approval-for-clinical-cloud-based-deep-learning-in-healthcare/
  • 2017-01-24 10:16:28> @U0AAL4W13: First FDA Approval For Clinical Cloud-Based Deep Learning In Healthcare
  • 2017-01-26 17:16:30> @U3WRNP30B: <@U3WRNP30B> has joined the channel
  • 2017-01-30 11:54:23> @U3Y2FPGBV: <@U3Y2FPGBV> has joined the channel
  • 2017-01-30 16:49:01> @U3XHSAQHE: <@U3XHSAQHE> has joined the channel
  • 2017-02-04 13:50:14> @U41049CQ2: <@U41049CQ2> has joined the channel
  • 2017-02-07 15:53:42> @U42P4AT7Z: <@U42P4AT7Z> has joined the channel
  • 2017-02-17 18:29:37> @U2PFHNN3C: Detecting skin cancer with a state-of-the-art neural network, https://www.youtube.com/watch?v=toK1OSLep3s with an android demo at the end
  • 2017-02-17 18:52:55> @U2PFHNN3C: Again, SotA deep model for retinal disease detection https://www.youtube.com/watch?v=oOeZ7IgEN4o
  • 2017-02-17 18:54:17> @U2PFHNN3C: So, again, what you need: - huge data - potentially crowd-source the labels (being careful with the variations in the diagnosis among practitioners) - big infrastructure to learn these big models (reactions: @U0B47KC3S,@U37GZRZU6)
  • 2017-02-17 18:54:42> @U2PFHNN3C: (I’m not including: good library and necessary skills. We have them already ^_^)
  • 2017-02-17 18:55:57> @U2PFHNN3C: Now it doesn’t mean it works out of the box, a LOT of tuning and engineering is needed to have concrete results beyond mere proof of concepts
  • 2017-02-17 19:16:02> @U0B47KC3S: hi <@U2PFHNN3C> it's astounding :wink: could we have a call next week to prepare the next campaign: funds for a post-doc + aphp imagery data
  • 2017-02-17 19:21:43> @U2PFHNN3C: Sure! Whenever you want, starting from 6pm
  • 2017-02-17 19:33:48> @U0B47KC3S: let’s say thursday ?
  • 2017-02-17 19:38:27> @U2PFHNN3C: Just sent the invite
  • 2017-02-17 19:51:02> @U0B47KC3S: just accepted it :wink: (reactions: @U2PFHNN3C)
  • 2017-02-17 21:09:45> @U37GZRZU6: <@U2PFHNN3C> <@U0B47KC3S> can I invite myself in your call as a spectator? :slightly_smiling_face:
  • 2017-02-18 01:39:33> @U0B47KC3S: hi hi so sure <@U37GZRZU6> :ok_hand:
  • 2017-02-18 02:03:41> @U37GZRZU6: At what time did you plan it? Won't be available before 8:30 on Thursday :confused:
  • 2017-02-18 02:07:43> @U0B47KC3S: we thought about 6pm but maybe <@U2PFHNN3C> can do later. for me it is ok :wink:
  • 2017-02-18 17:51:07> @U37GZRZU6: thursday 1:00 pm ?
  • 2017-02-18 18:26:37> @U0B47KC3S: for me it is ok !
  • 2017-02-18 18:28:46> @U37GZRZU6: <@U2PFHNN3C> ? :cat:
  • 2017-02-18 18:29:34> @U2PFHNN3C: Sure (reactions: @U37GZRZU6)
  • 2017-02-24 11:23:18> @U492PCSE9: <@U492PCSE9> has joined the channel
  • 2017-02-27 14:42:34> @U2PFHNN3C: On domain adaptation (transfer learning) https://arxiv.org/pdf/1702.05374.pdf
  • 2017-03-03 00:59:36> @U4CAG5ZFW: <@U4CAG5ZFW> has joined the channel
  • 2017-03-03 00:59:36> @U4DFR8RN3: <@U4DFR8RN3> has joined the channel
  • 2017-03-08 15:28:31> @U2PFHNN3C: Could also be something worth investigating https://arxiv.org/abs/1703.01220
  • 2017-03-08 17:01:51> @U0B47KC3S: but that is exactly our theme!
  • 2017-03-08 23:37:20> @U0AAL4W13: You may have seen this: http://blog.kaggle.com/2017/03/08/kaggle-joins-google-cloud/ :)
  • 2017-03-09 00:03:50> @U0B47KC3S: omg this is amazing news - #epidemium <@U04DFTZ7D>
  • 2017-03-14 15:56:51> @U4J138ZTL: <@U4J138ZTL> has joined the channel
  • 2017-03-15 12:19:14> @U2PFHNN3C: Hello <@U0B47KC3S>, I am currently registering echOpen as an official project within the ROMEO infrastructure. One of the sections to fill in is a long description (formatted text + images); would you happen to have that at hand? A high-resolution logo is also required, if possible.
  • 2017-03-15 12:25:08> @U0AAL4W13: ( ping <@U37GZRZU6> et <@U0FN1B8KD> who could be interested in this type of resources and making it available maybe on github when we were discussing )
  • 2017-03-15 12:25:11> @U0FN1B8KD: <@U0FN1B8KD> has joined the channel
  • 2017-03-15 12:39:18> @U0B47KC3S: hi <@U2PFHNN3C> sure, I can send something to you! btw, for the logo ping <@U04DFTZ7D>
  • 2017-03-15 14:23:03> @U2PFHNN3C: Thanks <@U0B47KC3S> :slightly_smiling_face: I would also need the list of current publications if possible
  • 2017-03-15 19:31:38> @U0FN1B8KD: <@U2PFHNN3C>
  • 2017-03-17 12:23:43> @U2PFHNN3C: Hello <@U04DFTZ7D> is there a high-resolution echopen logo somewhere?
  • 2017-03-17 15:18:02> @U0AAL4W13: don't hesitate to share all of this in a common, easy to access place :) ping <@U37GZRZU6> too to implement the processes ;) #methodo
  • 2017-03-17 19:29:08> @U0B47KC3S: hello <@U2PFHNN3C> while waiting for <@U04DFTZ7D>, I found this
  • 2017-03-17 19:29:25> @U0B47KC3S: <@U0B47KC3S>
  • 2017-03-17 19:29:43> @U2PFHNN3C: Mea maxima culpa <@U0B47KC3S>, it is already settled and ROMEO has in principle registered echOpen for 2017
  • 2017-03-17 19:29:52> @U2PFHNN3C: I will nevertheless write to Arnaud to confirm this
  • 2017-03-17 19:29:55> @U0B47KC3S: tOp !
  • 2017-03-17 19:30:09> @U2PFHNN3C: I took the gitbook documentation along with its (superb) illustrations
  • 2017-03-17 19:30:20> @U0B47KC3S: nice :wink:
  • 2017-03-20 20:58:39> @U0AAL4W13: Some news on DeepMind and health : http://www.theverge.com/2017/3/16/14932764/deepmind-google-uk-nhs-health-data-analysis :smiley:
  • 2017-03-21 08:11:26> @U2PFHNN3C: https://romeo.univ-reims.fr/projet.php?id=172_Cp7YNPThlL6TGhV (reactions: @U0B47KC3S,@U37GZRZU6,@U0AAL4W13)
  • 2017-03-21 08:13:01> @U2PFHNN3C: In the short description, I mentioned that potential challenges could be organized, hence requiring ROMEO's support
  • 2017-03-21 08:21:22> @U0B47KC3S: great djalel ! :wink:
  • 2017-03-22 08:28:43> @U0AAL4W13: there is this one guy, not bad at AI, who is resigning from his job to work for the AI community at large; maybe interesting to recruit him?
  • 2017-03-22 08:28:47> @U0AAL4W13: https://medium.com/@andrewng/opening-a-new-chapter-of-my-work-in-ai-c6a4d1595d7b#.k66s2qz2j (reactions: @U37GZRZU6)
  • 2017-03-22 09:11:39> @U2PFHNN3C: whatever this guy touches turns to gold...
  • 2017-03-22 09:15:50> @U2PFHNN3C: Everyone is extremely excited about AI right now… http://www.economie.gouv.fr/files/files/PDF/2017/Rapport_synthese_France_IA_.pdf
  • 2017-03-22 09:19:48> @U0AAL4W13: What is sweet is that he is keen on teaching and open to developing his community
  • 2017-03-22 09:19:58> @U0AAL4W13: I used to follow his classes on coursera
  • 2017-03-22 09:26:15> @U0B47KC3S: yes why not ping him one of these days !
  • 2017-03-24 11:22:25> @U0B47KC3S: I am currently at a working session between Academy of Sciences and Academy of Medicine
  • 2017-03-24 11:22:29> @U0B47KC3S: This was created in the wake of epidemium, an open challenge on Big Data and oncology that we coordinate with <@U04DFTZ7D>
  • 2017-03-24 11:22:35> @U0B47KC3S: We have just had a presentation of Johan BRAG around progress and prospects in AI in medical imaging #ThunderBlasting :zap::zap::zap:
  • 2017-03-24 11:22:37> @U0B47KC3S: I will give you feedback on wednesday, especially on sophisticated mathematical metrics to compare images or to get a digital signature of them :nerd_face:
  • 2017-03-24 17:14:25> @U0B47KC3S: <@U2PFHNN3C> great news, we are eligible for ARF through Université Paris Descartes, our contact will be Daniele Dywan :wink: !
  • 2017-03-24 17:18:59> @U2PFHNN3C: Top :) <@U0B47KC3S>. She's an admin ?
  • 2017-03-24 17:20:50> @U0B47KC3S: yes !
  • 2017-03-24 17:29:19> @U37GZRZU6: What's arf? :)
  • 2017-03-24 19:37:26> @U2PFHNN3C: no more registrations at ICLR :((((((((((
  • 2017-03-24 19:43:21> @U37GZRZU6: Seriously?? I hope the secretary did it on time for us :astonished:
  • 2017-03-28 21:35:24> @U2PFHNN3C: So here is how a “knowledge base” that would allow us to work collaboratively looks like http://134.158.74.234:8080/feed
  • 2017-03-28 21:36:02> @U2PFHNN3C: the workflow is: you write a jupyter notebook, Rmd, or some markdown notes, you push that into your repo, you send a PR and it’s merged into the common knowledge base
  • 2017-03-28 22:12:37> @U0B47KC3S: excellent <@U2PFHNN3C> :wink:
  • 2017-03-28 22:41:18> @U0AAL4W13: Looks like a mix of blog posts and commits - fun :smiley:
  • 2017-03-28 22:47:23> @U2PFHNN3C: yeah, it’s a hack mixing git, flask, and notebooks
  • 2017-03-28 22:47:55> @U2PFHNN3C: flask is connected to a database so it can be updated on the fly, it allows comments, tags etc
  • 2017-03-28 22:48:05> @U2PFHNN3C: and even mail subscription I think
  • 2017-03-28 23:33:25> @U2PFHNN3C: set up a reminder “speed meeting every Tuesday at 18:30” in this channel at 9AM every Tuesday, Central European Summer Time.
  • 2017-03-28 23:34:32> @U2PFHNN3C: set up a reminder about “Weekly speed meeting” in this channel at 6:30PM every Tuesday, Central European Summer Time. (reactions: @U0B47KC3S)
  • 2017-04-04 18:30:00> @USLACKBOT: Reminder: Weekly speed meeting.
  • 2017-04-04 18:33:30> @U2PFHNN3C: Who’s in?
  • 2017-04-04 18:38:48> @U37GZRZU6: <@U2PFHNN3C> we're available, do you have a link for the hangout ?
  • 2017-04-04 18:40:12> @U37GZRZU6: we joined last week's call : https://hangouts.google.com/hangouts/_/calendar/ZGphbGVsLmJlbmJvdXppZEBnbWFpbC5jb20._6933adpj88qjcb9i6op3ib9k651j8b9o6srk6b9i89244e1k8913ag9g8c?authuser=0
  • 2017-04-04 18:40:53> @U2PFHNN3C: Good, this is what I had copied anyway
  • 2017-04-04 18:43:54> @U37GZRZU6: <@U2PFHNN3C> where are you ? :hatching_chick:
  • 2017-04-04 18:44:07> @U2PFHNN3C: can't you see me?
  • 2017-04-04 19:08:55> @U2PFHNN3C: https://www.youtube.com/watch?v=MHTizZ_XcUM&feature=youtu.be
  • 2017-04-04 19:17:13> @U2PFHNN3C: could be useful to the phantom/aquarium
  • 2017-04-10 17:35:25> @U37GZRZU6: Basic medical image analysis in python + links to potentially useful websites, including dicom images library https://www.kdnuggets.com/2017/03/medical-image-analysis-deep-learning.html#%2EWOZognkqOL0%2Elinkedin
  • 2017-04-10 17:59:21> @U0AAL4W13: Thanks ;)
  • 2017-04-11 16:23:54> @U37GZRZU6: <@U2PFHNN3C> is there a meeting today? corollary question : why did the reminder not work?
  • 2017-04-11 17:12:31> @U2PFHNN3C: <@U37GZRZU6> it rings at 18:30 ;)
  • 2017-04-11 17:13:28> @U37GZRZU6: djabbz [11:33 PM] set up a reminder “speed meeting every Tuesday at 18:30” in this channel at 9AM every Tuesday, Central European Summer Time. :thinking_face:
  • 2017-04-11 17:19:47> @U2PFHNN3C: In this channel: • Remind about “Weekly speed meeting” at 6:30PM every Tuesday. Delete Past and incomplete: • About this message from <@U37GZRZU6> in Complete · Delete · Snooze: 15 mins · 1 hr · Tomorrow Close list (reactions: @U37GZRZU6)
  • 2017-04-11 17:33:09> @U2PFHNN3C: have you tried /reminder list?
  • 2017-04-11 18:30:00> @USLACKBOT: Reminder: Weekly speed meeting.
  • 2017-04-11 18:31:40> @U2PFHNN3C: <@U37GZRZU6> see? :slightly_smiling_face:
  • 2017-04-11 18:31:47> @U2PFHNN3C: who’s in?
  • 2017-04-11 18:33:21> @U37GZRZU6: it seems that no one's in...
  • 2017-04-11 18:33:27> @U2PFHNN3C: only the two us?
  • 2017-04-11 18:33:32> @U2PFHNN3C: call on the phone?
  • 2017-04-11 18:34:38> @U2PFHNN3C: it shouldn’t be long. My main points: - I attended a student presentation today which tackled ultrasound images with weak supervision and I’m happy about that - we urgently need labeled data :slightly_smiling_face:
  • 2017-04-11 18:35:13> @U37GZRZU6: ping <@U0B47KC3S> :smile:
  • 2017-04-11 18:38:20> @U2PFHNN3C: Bref, ça servira de confcall asynchrone… : my take home message: visualization is essential. The interface must give some feedback to the practitioner, telling him why it’s saying there is some anomaly etc.
  • 2017-04-11 18:38:26> @U0B47KC3S: hi there, I won’t be available for that meeting. However, I am trying to connect to Paris Descartes for the ARF application. Btw, Aurelie and I have a meeting thursday with the datalake guys to know more about image data we can get !
  • 2017-04-11 18:41:32> @U2PFHNN3C: set up a reminder “Just a friendly reminder that there is a meeting later” in this channel at 1PM every weekday, Central European Summer Time.
  • 2017-04-11 18:42:54> @U2PFHNN3C: It seems possible to train ConvNets with weak supervision (i.e. only the diagnosis, for instance) and still get a (sort of) segmentation of the image, which is good news
  • 2017-04-11 18:43:16> @U2PFHNN3C: there is a recent paper (2014) that proposes a method
  • 2017-04-11 18:44:11> @U2PFHNN3C: Second take-home message for me: again, the UX should work hand in hand with the AI team in order to both collect data and give the right feedback to the user, in a principled and ergonomic way
  • 2017-04-11 18:46:08> @U2PFHNN3C: Third take-home message (because things never come in twos without a third, motto of the 2b3, authors of our official anthem): we need to gather some data quickly, 2000 images minimum, 10k would be good. Then, if the label is not available, crowdsource it from a committee of doctors. The idea is to have a collegial decision for each image.
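[Editor's note: the collegial-decision idea above amounts to a label-aggregation step. A minimal sketch in Python; the image IDs, vote values, and the "send back for review below a threshold" rule are hypothetical illustrations, not the project's actual protocol.]

```python
from collections import Counter

def aggregate_labels(votes):
    """Aggregate per-image labels from a committee of annotators.

    votes: dict mapping image_id -> list of labels given by doctors.
    Returns dict mapping image_id -> (majority label, agreement ratio).
    Images with a low agreement ratio could be sent back for review.
    """
    consensus = {}
    for image_id, labels in votes.items():
        counts = Counter(labels)
        label, n = counts.most_common(1)[0]
        consensus[image_id] = (label, n / len(labels))
    return consensus

# Hypothetical committee votes on two ultrasound images
votes = {
    "img_001": ["kidney", "kidney", "liver"],
    "img_002": ["bladder", "bladder", "bladder"],
}
print(aggregate_labels(votes))
# -> {'img_001': ('kidney', 0.666...), 'img_002': ('bladder', 1.0)}
```

The agreement ratio doubles as a cheap proxy for inter-annotator variability, which message L31's caveat about diagnostic variation among practitioners suggests tracking.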
  • 2017-04-11 18:46:43> @U2PFHNN3C: Also, the data collection protocol needs to be specified, which anatomical parts, which planes, number of different patients, etc.
  • 2017-04-11 18:48:41> @U37GZRZU6: <@U2PFHNN3C> you made me think (this thought is validated by president <@U0B47KC3S> :upside_down_face: ) that it would be OUFISSIME that you put those kind of remarks (about collecting data etc...) in the product backlog, because actually that's exactly what it stands for ! What do you think ? https://echopen.gitbooks.io/echopen_prototyping/backlog/backlog.html
  • 2017-04-11 18:49:12> @U37GZRZU6: <@U0B47KC3S> <@U3QGT3Q74> are working on the "specs" part I think it should include what you are saying
  • 2017-04-11 18:49:15> @U2PFHNN3C: <@U37GZRZU6> I was thinking about this, thanks for the reminder :slightly_smiling_face: (reactions: @U37GZRZU6)
  • 2017-04-11 18:50:53> @U2PFHNN3C: OUFISSIME? you mean awfsome? (reactions: @U37GZRZU6)
  • 2017-04-11 18:53:44> @U0B47KC3S: <@U2PFHNN3C> got it for your request ! I am going to check what is available under the hood
  • 2017-04-11 18:56:37> @U2PFHNN3C: <@U0B47KC3S> oufsome !
  • 2017-04-11 18:59:09> @U37GZRZU6: <@U2PFHNN3C> what's your github account ?
  • 2017-04-11 18:59:16> @U2PFHNN3C: djalelbbz
  • 2017-04-11 19:07:57> @U2PFHNN3C: thanks <@U37GZRZU6> :wink: (reactions: @U37GZRZU6)
  • 2017-04-12 13:00:00> @USLACKBOT: Reminder: Just a friendly reminder that there is a meeting later. (reactions: @U37GZRZU6)
  • 2017-04-12 13:33:30> @U37GZRZU6: <@U2PFHNN3C> please do something for the reminder :joy:
  • 2017-04-12 13:34:49> @U2PFHNN3C: <@U37GZRZU6> oh my bad, probably out of excessive enthusiasm, I set it up for the whole week… (reactions: @U37GZRZU6)
  • 2017-04-12 15:00:24> @U2PFHNN3C: Just to notice, I’ll be remote working from Paris tomorrow, so I’ll probably come and say hello at the Hotel-Dieu at some point (reactions: @U37GZRZU6,@U0B47KC3S)
  • 2017-04-12 15:03:05> @U37GZRZU6: FYI, in the morning we are in a meeting, and in the afternoon after 4pm as well; for the rest, I don't know <@U0B47KC3S>'s personal agenda :wink:
  • 2017-04-13 22:03:35> @U0AAL4W13: Just discussed with a student who wants to do elastography with neural networks :p
  • 2017-04-14 02:42:26> @U4YF0KAJU: <@U4YF0KAJU> has joined the channel
  • 2017-04-14 17:12:07> @U2PFHNN3C: Just in case you missed this excellent blog post http://distill.pub/2017/momentum/
  • 2017-04-14 17:47:29> @U4YF0KAJU: Hi everyone, glad to be involved here. I have worked previously on applying machine learning to chest radiographs using standard “off the shelf” published models (ie. googlenet). Can see publication here: https://www.ncbi.nlm.nih.gov/pubmed/27922974 (reactions: @U37GZRZU6)
  • 2017-04-14 17:49:07> @U4YF0KAJU: I have thought about applying similar models to ultrasound for the purpose of organ identification. The way I see it (as a radiologist) when we learn to diagnose the very first step is to know what you are looking at. Therefore, one fundamental tool to computer-aided diagnosis with ultrasound will be reliable organ identification.
  • 2017-04-14 17:53:15> @U4YF0KAJU: I am trying to get my hands on some ultrasound data to do this. As I am new to echopen, I don’t know how far along the hardware/software components are and hence the impact of differences in image quality between a medical grade ultrasound image vs. echopen.
  • 2017-04-14 23:50:26> @U2PFHNN3C: Very good initiative <@U4YF0KAJU>! Ultrasound data is apparently not as prevalent as other medical imaging data. If you have any hint on that, it'd be great
  • 2017-04-14 23:55:20> @U2PFHNN3C: The question of knowledge transfer from different ultrasound devices with different image quality is still an open question to me. IMO the best would be to start something off, with any data available, then use it as weight initialisation to learn from echopen data.
  • 2017-04-14 23:58:17> @U2PFHNN3C: I’ve met a student a few days ago who did his masters thesis on convnets on ultrasound images. He had to collect the data and annotate it himself :confused:
  • 2017-04-15 17:44:13> @U4YF0KAJU: Yeah I agree. I am confident I can get a hold of some data. I think the best place to start would be abdomen and pelvis studies. If obtained from a reputable center, I have an idea on how to (semi-)automatically label the images. All ultrasound images get labelled by the technologists (i.e. Rt Kid, Lt liver, bladder etc). Basically we can use some OCR to extract all the text information on the images, crop them (both to exclude PHI - this will likely be done where I obtain the data, and to make it so the net is not learning based on the embedded text in the image). If the dataset is large enough I have a strong suspicion it will learn something. The segmentation problem is tougher. Do we have any images generated from the echopen? (reactions: @U37GZRZU6,@U0AAL4W13)
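[Editor's note: the semi-automatic labeling pipeline described above (read the technologist's burned-in annotation, map it to a canonical class, then crop the text band out so the net cannot learn from the embedded text) can be sketched as follows. The `ocr` callable stands in for a real OCR engine such as Tesseract, and the label vocabulary and band geometry are hypothetical, not any site's actual convention.]

```python
# Hypothetical mapping from technologist annotations to organ classes.
LABEL_MAP = {
    "RT KID": "right_kidney",
    "LT KID": "left_kidney",
    "LT LIVER": "liver",
    "BLADDER": "bladder",
}

def extract_label(image, ocr):
    """Run OCR on the image and return the first recognized organ class.

    `ocr` is any callable image -> text (e.g. a wrapper around Tesseract).
    Returns None when no known annotation is found, so the image can be
    routed to manual review instead.
    """
    text = ocr(image).upper()
    for annotation, organ in LABEL_MAP.items():
        if annotation in text:
            return organ
    return None

def crop_text_band(image, band_height):
    """Drop the top rows holding burned-in text, so the network is not
    learning from the embedded annotation. `image` is a list of pixel rows."""
    return image[band_height:]

# Hypothetical usage with a fake OCR that "reads" a burned-in label.
fake_ocr = lambda img: "RT KID 2017-04-15"
image = [[0] * 4 for _ in range(6)]           # 6x4 toy image
label = extract_label(image, fake_ocr)        # "right_kidney"
clean = crop_text_band(image, band_height=2)  # 4 pixel rows remain
```

PHI removal would happen upstream (where the data is obtained, as noted above); this sketch only covers label extraction and the anti-shortcut crop.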
  • 2017-04-16 13:45:13> @U2PFHNN3C: We still have the card of crowdsourcing the image labels, even though it’s not the most trivial one. <@U0B47KC3S> if we get our hands on a bunch of US unlabelled images (assume no text to be OCRed), do you think we can organize a “large scale” labelling campaign among doctors?
  • 2017-04-16 13:47:49> @U2PFHNN3C: Segmentation done the right way needs a mask for every image to serve as a label. I agree with you, it’s tougher. This is why I’m currently interested in weak supervision. You can still kinda maximize the input pixels that made your label the most likely, hence having a sort of smooth segmentation to show the doctor with. Plus, it is part of a bigger rule of thumb for the UX: always explain what the AI part is doing.
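[Editor's note: the "maximize the input pixels that made your label the most likely" idea above can be illustrated with occlusion-style saliency, one simple way to get the smooth pseudo-segmentation to show the doctor: perturb each pixel and measure how much the class score drops. A toy sketch; the linear scorer and 2x2 image stand in for a real ConvNet and ultrasound frame.]

```python
def saliency_map(score_fn, image):
    """Occlusion-style saliency: zero out each pixel and record the drop
    in the class score. Large drops mark pixels the model relied on,
    giving a rough heatmap to display alongside the prediction."""
    base = score_fn(image)
    sal = []
    for i, row in enumerate(image):
        sal_row = []
        for j, _ in enumerate(row):
            perturbed = [r[:] for r in image]  # copy, then occlude one pixel
            perturbed[i][j] = 0.0
            sal_row.append(base - score_fn(perturbed))
        sal.append(sal_row)
    return sal

# Toy "model": a weighted sum that only attends to pixel (0, 1).
weights = [[0.0, 2.0], [0.0, 0.0]]
score = lambda img: sum(w * p for wr, pr in zip(weights, img)
                        for w, p in zip(wr, pr))
image = [[1.0, 3.0], [5.0, 7.0]]
print(saliency_map(score, image))  # -> [[0.0, 6.0], [0.0, 0.0]]
```

This costs one forward pass per pixel; gradient-based saliency is the cheaper variant for real ConvNets, but the UX principle is the same: always show the practitioner which regions drove the decision.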
  • 2017-04-18 00:01:39> @U2PFHNN3C: This is amazing https://arxiv.org/abs/1704.04296
  • 2017-04-18 00:01:55> @U2PFHNN3C: https://twitter.com/jliemansifry/status/854058694408941568
  • 2017-04-18 00:15:37> @U0B47KC3S: <@U2PFHNN3C> wow, seems incredible :upside_down_face: can't wait to see this on our side. Btw, I am still waiting for a response about the images. I'll tell you asap :wink: !
  • 2017-04-18 08:23:33> @U2PFHNN3C: <@U0B47KC3S> thanks mate! (reactions: @U0B47KC3S)
  • 2017-04-18 15:04:24> @U37GZRZU6: hey I won't be available for today's meeting. see you next week :wink:
  • 2017-04-18 15:43:42> @U2PFHNN3C: no problem <@U37GZRZU6>. Who’s up BTW, <@U4J138ZTL> <@U4HTG04JW> <@U4YF0KAJU> ? others ?
  • 2017-04-18 15:43:49> @U4HTG04JW: <@U4HTG04JW> has joined the channel
  • 2017-04-18 15:53:53> @U0B47KC3S: hi <@U2PFHNN3C> one of my target points is to reach david reizine to evaluate how we can get images!
  • 2017-04-18 15:58:26> @U37GZRZU6: set up a reminder “:loud_sound: a memo for our weekly meeting on tuesday at 6:30 pm.” in this channel at 2PM every Tuesday, Central European Summer Time.
  • 2017-04-18 16:02:40> @U2PFHNN3C: great! let’s make a memo per person in this channel, just choose different hours :joy:
  • 2017-04-18 16:05:32> @U4YF0KAJU: That is pretty awesome. It was certainly a “low-hanging fruit”. As a resident/fellow doing cardiac MRI, you literally have to do that segmentation process manually on 30 slices x 2 phases x 2 ventricles. Meaning you can get through maybe 3 or 4 cases in a day. There have been automated segmentation algorithms in these software packages for years, however if they do not function with >95% accuracy (I would say) they are useless because everything needs to be redrawn anyway. It is only a matter of time before novel machine learning algorithms become good enough. Now what will all the cardiac fellows do?!? lol
  • 2017-04-18 16:06:46> @U4YF0KAJU: What kind of images are you trying to obtain? Are you trying to get some from the echopen prototypes?
  • 2017-04-18 16:09:07> @U4YF0KAJU: Is there a calendar for meetings? Where do we usually meet? Some kind of google hangout?
  • 2017-04-18 16:13:37> @U2PFHNN3C: <@U0B47KC3S> for now, our meetings can be asynchronous, slack messages, github cards etc.
  • 2017-04-18 16:32:07> @U37GZRZU6: there was no memo, you deleted it last week
  • 2017-04-18 16:43:47> @U2PFHNN3C: just to be clear, I’m not saying the weekly meetings are not important :slightly_smiling_face:. I think they’re essential. It’s just that we’re in some sort of “burn-in” phase where things can go quicker if done by smaller groups that sync occasionally.
  • 2017-04-18 16:50:13> @U37GZRZU6: I don't get it : are there weekly meetings or not ?
  • 2017-04-18 17:04:37> @U2PFHNN3C: gonna depend on who’s in. So far no one answered.
  • 2017-04-18 17:31:21> @U2PFHNN3C: Yeah, hangout or whatever (I find hangout more practical AFAIK)
  • 2017-04-18 17:31:30> @U2PFHNN3C: was just kiddin'
  • 2017-04-18 17:33:28> @U2PFHNN3C: <@U37GZRZU6> got a point (offline). We need (strict) weekly meetings. Who’s in today? (we’ll fix the hour once the participants’ list is clear) (reactions: @U37GZRZU6,@U0AAL4W13)
  • 2017-04-18 18:04:29> @U4J138ZTL: Won't be there today :disappointed:
  • 2017-04-18 18:30:00> @USLACKBOT: Reminder: Weekly speed meeting.
  • 2017-04-18 18:42:04> @U2PFHNN3C: no one wants to have a “tête à tête” with me :disappointed: ?
  • 2017-04-18 18:42:27> @U0AAL4W13: Haha
  • 2017-04-18 18:42:28> @U2PFHNN3C: more seriously, <@U4YF0KAJU> do you have a moment this week for a call?
  • 2017-04-18 19:38:19> @U4YF0KAJU: <@U2PFHNN3C> sure. Pretty flexible for time this week so whenever works. I’m in Eastern Standard Timezone. Google Hangouts or skype?
  • 2017-04-19 13:28:40> @U2PFHNN3C: <@U4YF0KAJU> (sorry for the late reply) today at noon (Toronto time) would be good for me. Same for tomorrow. Preference/alternative?
  • 2017-04-19 15:34:43> @U4YF0KAJU: <@U2PFHNN3C> lets do tomorrow Apr 20 at 12pm EST.
  • 2017-04-19 15:35:31> @U2PFHNN3C: <@U4YF0KAJU> Booked. My mail in DM.
  • 2017-04-20 14:46:12> @U0AAL4W13: A future image lake? http://www.thepocusatlas.com/
  • 2017-04-25 14:00:00> @USLACKBOT: Reminder: :loud_sound: a memo for our weekly meeting on tuesday at 6:30 pm.
  • 2017-04-25 16:18:51> @U2PFHNN3C: <!channel> what would be the best day for you for the weekly meeting, Monday or Wednesday? (1800 or 1830) (reactions: @U37GZRZU6)
