Vidyo link

Action Items:

Fill in the list sent by Federica, marking the datasets that you are currently using. BY THE END OF THE WEEK.

News:

Welcome to Paul. He is now effectively a CMS Padova member.

Framework and production

* The cloud (Condor) was heavily used during the Christmas break by Stefano (to run limits).

  • Possible troubles with Lustre / the UIs are probably due to this usage of resources by Stefano.
  • The situation should not change for 2-3 weeks.

* Old datasets need to be removed from the Legnaro-local storage.

  • Fill in the list sent by Federica, marking the datasets that you are currently using. BY THE END OF THE WEEK.
  • Everything before 2015 will be deleted by default unless stated otherwise.

Analysis activities

B physics

* P5' - the proposed solution is to run the full Feldman-Cousins procedure: 400,000 fits on toys, with a single fit taking 3 minutes.

  • Result presented to the B PAG on Tuesday.
  • Preliminary results show that the uncertainties obtained are mostly identical to those obtained with the simpler method; moreover, the uncertainty is larger than the difference between the results.
  • The physics conveners were present at the meeting and their comment was 'this is insane, why did you do it?'. Bob was there... he said 'the decision was taken by others'.
  • Now, direct interaction with the physics coordinator and the statistics committee; they really do not care about it anymore.
  • Plan: finish with half as many toys again, i.e. another 2 weeks of running (see the back-of-envelope estimate below).
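A rough back-of-envelope sketch of the CPU budget behind the numbers quoted above (400,000 toy fits at ~3 minutes each), assuming "half as many toys again" means roughly 200,000 fits still to run; the number of concurrent Condor slots is an illustrative assumption, not a figure from the meeting.

```python
# Back-of-envelope CPU estimate for the Feldman-Cousins toy campaign.
# Numbers from the minutes: 400,000 toy fits, ~3 minutes per fit.
n_toys_total = 400_000      # total toy fits quoted in the minutes
minutes_per_fit = 3         # single-fit time quoted in the minutes
remaining_fraction = 0.5    # ASSUMPTION: "half more toys" still to run
parallel_slots = 30         # ASSUMPTION: illustrative number of concurrent Condor slots

cpu_hours_total = n_toys_total * minutes_per_fit / 60          # ~20,000 CPU-hours
cpu_hours_left = cpu_hours_total * remaining_fraction          # ~10,000 CPU-hours
wall_days_left = cpu_hours_left / parallel_slots / 24          # ~14 days on 30 slots

print(f"Total campaign : {cpu_hours_total:,.0f} CPU-hours")
print(f"Remaining      : {cpu_hours_left:,.0f} CPU-hours "
      f"= about {wall_days_left:.0f} days on {parallel_slots} slots")
```

With a few tens of concurrent slots this works out to roughly two weeks of wall time, consistent with the schedule quoted in the plan above.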

* Next 2 weeks - work to set up the trigger menu for this year.

  • Nicola and Marina agreed to work on that.
  • Start from last year's paths and perform checks.

* Paolo: continuing with the checks on MiniAOD (to see if we can survive without AOD).

Dark Matter

* The 2015 analysis is now carried out by Siew. He is moving to Fermilab (from March).
  • Jacopo (Brown University) will keep working on this topic and will supervise his work.
  • Summer is the target to close the work on the 2016 data.
  • DM discussion at the next CMS Italia meeting.

Dibosons, resonant VZ (V=W,Z)

* Formal approval of the B2G analysis (ICHEP dataset) for VZ just before Christmas.
  • Not many comments from the conveners or the 'public'.
  • Sharam claims that he wants to push the button (make the PAS) only before the next conference ().
  • We do not want to spend additional effort on it. The final goal is to have a paper for Moriond with the entire dataset.
* Plan for Moriond:
  • Include Lisa's work - the invisible channel.
    • AN produced and comments received.
    • Study to check whether b-tag categorization could help.
    • Meeting with Alberto to plan the combination with the other channel.
  • The request is to go with a paper on the full dataset, but it has to be adapted to the current situation.
* Paul will start helping with the ALPHA framework and looking at the status of the diboson analysis.

Non-resonant hh4b

* Status report to Hbb just before Christmas. No expected limit shown, since the MVA was not finalized.
  • Plan to show a full status report to Hbb next Wednesday and to ask for a CADI line. We would like to have the paper separated from the resonant hh4b search; the conveners have already been asked, but no clear answer has been received yet.
  • Working on the fit strategy and the MVA optimization. Next week we will move to the full dataset and the new MC.
  • Pablo is in Padova for one month. Possible manpower issues later on.

Same sign leptons

* Latest version of the AN circulated before Christmas.
  • Another reviewer joined the review.
  • No particular showstoppers. Plan to deliver the 4th version within next week.
  • The plan is to move to the re-reco and the new MC, to proceed with the full datasets.

Seesaw at 13 TeV

EPR

Muon isolation

Flavor tagging

Trigger

Detector

* For the next two Thursdays there will be reports on activities.

A.O.B.:

-- FrancoSimonetto - 2017-01-11
